Test Report: Hyperkit_macOS 17174

7689d73509a567ada6f3653fa0ef2156acc9a338:2023-09-06:30902

Failed tests (9/300)

TestForceSystemdEnv (20.71s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-420000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:155: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p force-systemd-env-420000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : exit status 90 (15.10344649s)

-- stdout --
	* [force-systemd-env-420000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=true
	* Using the hyperkit driver based on user configuration
	* Starting control plane node force-systemd-env-420000 in cluster force-systemd-env-420000
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

-- /stdout --
** stderr ** 
	I0906 17:05:42.187451    4382 out.go:296] Setting OutFile to fd 1 ...
	I0906 17:05:42.187664    4382 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 17:05:42.187671    4382 out.go:309] Setting ErrFile to fd 2...
	I0906 17:05:42.187675    4382 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 17:05:42.187857    4382 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	I0906 17:05:42.189542    4382 out.go:303] Setting JSON to false
	I0906 17:05:42.209502    4382 start.go:128] hostinfo: {"hostname":"MacOS-Agent-1.local","uptime":2115,"bootTime":1694043027,"procs":409,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.5.1","kernelVersion":"22.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0906 17:05:42.209590    4382 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0906 17:05:42.233135    4382 out.go:177] * [force-systemd-env-420000] minikube v1.31.2 on Darwin 13.5.1
	I0906 17:05:42.274938    4382 out.go:177]   - MINIKUBE_LOCATION=17174
	I0906 17:05:42.254044    4382 notify.go:220] Checking for updates...
	I0906 17:05:42.316944    4382 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	I0906 17:05:42.358743    4382 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 17:05:42.400848    4382 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 17:05:42.442872    4382 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	I0906 17:05:42.484821    4382 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=true
	I0906 17:05:42.507057    4382 config.go:182] Loaded profile config "offline-docker-615000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 17:05:42.507235    4382 driver.go:373] Setting default libvirt URI to qemu:///system
	I0906 17:05:42.536938    4382 out.go:177] * Using the hyperkit driver based on user configuration
	I0906 17:05:42.578971    4382 start.go:298] selected driver: hyperkit
	I0906 17:05:42.578984    4382 start.go:902] validating driver "hyperkit" against <nil>
	I0906 17:05:42.579010    4382 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 17:05:42.581807    4382 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:05:42.581912    4382 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17174-977/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 17:05:42.589395    4382 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.31.2
	I0906 17:05:42.593169    4382 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 17:05:42.593189    4382 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0906 17:05:42.593220    4382 start_flags.go:307] no existing cluster config was found, will generate one from the flags 
	I0906 17:05:42.593408    4382 start_flags.go:904] Wait components to verify : map[apiserver:true system_pods:true]
	I0906 17:05:42.593435    4382 cni.go:84] Creating CNI manager for ""
	I0906 17:05:42.593448    4382 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0906 17:05:42.593457    4382 start_flags.go:316] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0906 17:05:42.593469    4382 start_flags.go:321] config:
	{Name:force-systemd-env-420000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.1 ClusterName:force-systemd-env-420000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s}
	I0906 17:05:42.593641    4382 iso.go:125] acquiring lock: {Name:mk785f5a651fb55e13065a70647b69ec2c0160e8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:05:42.614944    4382 out.go:177] * Starting control plane node force-systemd-env-420000 in cluster force-systemd-env-420000
	I0906 17:05:42.656855    4382 preload.go:132] Checking if preload exists for k8s version v1.28.1 and runtime docker
	I0906 17:05:42.656903    4382 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/17174-977/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.1-docker-overlay2-amd64.tar.lz4
	I0906 17:05:42.656917    4382 cache.go:57] Caching tarball of preloaded images
	I0906 17:05:42.657034    4382 preload.go:174] Found /Users/jenkins/minikube-integration/17174-977/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.1-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0906 17:05:42.657045    4382 cache.go:60] Finished verifying existence of preloaded tar for  v1.28.1 on docker
	I0906 17:05:42.657138    4382 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/force-systemd-env-420000/config.json ...
	I0906 17:05:42.657159    4382 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/force-systemd-env-420000/config.json: {Name:mk75814f796b650b9d6b9881a8275c204d50cda5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 17:05:42.657449    4382 start.go:365] acquiring machines lock for force-systemd-env-420000: {Name:mkf8413c64a17b8d5adfa32fa8b277d511d8f398 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0906 17:05:42.657505    4382 start.go:369] acquired machines lock for "force-systemd-env-420000" in 40.955µs
	I0906 17:05:42.657526    4382 start.go:93] Provisioning new machine with config: &{Name:force-systemd-env-420000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.1 ClusterName:force-systemd-env-420000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0906 17:05:42.657570    4382 start.go:125] createHost starting for "" (driver="hyperkit")
	I0906 17:05:42.679022    4382 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	I0906 17:05:42.679257    4382 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 17:05:42.679295    4382 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 17:05:42.686428    4382 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:52185
	I0906 17:05:42.686785    4382 main.go:141] libmachine: () Calling .GetVersion
	I0906 17:05:42.687227    4382 main.go:141] libmachine: Using API Version  1
	I0906 17:05:42.687238    4382 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 17:05:42.687432    4382 main.go:141] libmachine: () Calling .GetMachineName
	I0906 17:05:42.687520    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetMachineName
	I0906 17:05:42.687598    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:42.687693    4382 start.go:159] libmachine.API.Create for "force-systemd-env-420000" (driver="hyperkit")
	I0906 17:05:42.687722    4382 client.go:168] LocalClient.Create starting
	I0906 17:05:42.687761    4382 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem
	I0906 17:05:42.687800    4382 main.go:141] libmachine: Decoding PEM data...
	I0906 17:05:42.687818    4382 main.go:141] libmachine: Parsing certificate...
	I0906 17:05:42.687873    4382 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem
	I0906 17:05:42.687901    4382 main.go:141] libmachine: Decoding PEM data...
	I0906 17:05:42.687910    4382 main.go:141] libmachine: Parsing certificate...
	I0906 17:05:42.687923    4382 main.go:141] libmachine: Running pre-create checks...
	I0906 17:05:42.687933    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .PreCreateCheck
	I0906 17:05:42.688031    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:42.688208    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetConfigRaw
	I0906 17:05:42.700045    4382 main.go:141] libmachine: Creating machine...
	I0906 17:05:42.700057    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .Create
	I0906 17:05:42.700220    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:42.700437    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | I0906 17:05:42.700222    4390 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/17174-977/.minikube
	I0906 17:05:42.700505    4382 main.go:141] libmachine: (force-systemd-env-420000) Downloading /Users/jenkins/minikube-integration/17174-977/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/17174-977/.minikube/cache/iso/amd64/minikube-v1.31.0-1692872107-17120-amd64.iso...
	I0906 17:05:42.923780    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | I0906 17:05:42.923716    4390 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/id_rsa...
	I0906 17:05:42.985699    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | I0906 17:05:42.985604    4390 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/force-systemd-env-420000.rawdisk...
	I0906 17:05:42.985720    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Writing magic tar header
	I0906 17:05:42.985738    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Writing SSH key tar header
	I0906 17:05:42.986271    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | I0906 17:05:42.986235    4390 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000 ...
	I0906 17:05:43.373723    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:43.373744    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/hyperkit.pid
	I0906 17:05:43.373819    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Using UUID 49222c5a-4d12-11ee-bc96-149d997cd0f1
	I0906 17:05:43.391242    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Generated MAC 76:4a:f6:89:c1:bc
	I0906 17:05:43.391283    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-420000
	I0906 17:05:43.391414    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"49222c5a-4d12-11ee-bc96-149d997cd0f1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000182390)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0906 17:05:43.391467    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"49222c5a-4d12-11ee-bc96-149d997cd0f1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000182390)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/initrd", Bootrom:"", CPUs:2, Memory:2048, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0906 17:05:43.391537    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/hyperkit.pid", "-c", "2", "-m", "2048M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "49222c5a-4d12-11ee-bc96-149d997cd0f1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/force-systemd-env-420000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/tty,log=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/bzimage,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-420000"}
	I0906 17:05:43.391608    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/hyperkit.pid -c 2 -m 2048M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 49222c5a-4d12-11ee-bc96-149d997cd0f1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/force-systemd-env-420000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/tty,log=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/console-ring -f kexec,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/bzimage,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=force-systemd-env-420000"
	I0906 17:05:43.391627    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0906 17:05:43.394820    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 DEBUG: hyperkit: Pid is 4391
	I0906 17:05:43.395224    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Attempt 0
	I0906 17:05:43.395242    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:43.395367    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | hyperkit pid from json: 4391
	I0906 17:05:43.396319    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Searching for 76:4a:f6:89:c1:bc in /var/db/dhcpd_leases ...
	I0906 17:05:43.396404    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0906 17:05:43.396421    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:05:43.396499    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:05:43.396525    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:05:43.396542    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:05:43.396564    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:05:43.396577    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:05:43.396586    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:05:43.396595    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:05:43.396608    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:05:43.396615    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:05:43.396627    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:05:43.396636    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:05:43.396644    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:05:43.396653    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:05:43.396661    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:05:43.396671    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:05:43.396678    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:05:43.401444    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0906 17:05:43.408802    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0906 17:05:43.409562    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 17:05:43.409585    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 17:05:43.409599    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 17:05:43.409616    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 17:05:43.772491    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0906 17:05:43.772507    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0906 17:05:43.876692    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 17:05:43.876731    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 17:05:43.876764    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 17:05:43.876787    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 17:05:43.877636    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0906 17:05:43.877647    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:43 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0906 17:05:45.397410    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Attempt 1
	I0906 17:05:45.397428    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:45.397517    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | hyperkit pid from json: 4391
	I0906 17:05:45.398369    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Searching for 76:4a:f6:89:c1:bc in /var/db/dhcpd_leases ...
	I0906 17:05:45.398468    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0906 17:05:45.398519    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:05:45.398535    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:05:45.398545    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:05:45.398558    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:05:45.398568    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:05:45.398593    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:05:45.398626    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:05:45.398635    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:05:45.398643    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:05:45.398651    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:05:45.398660    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:05:45.398674    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:05:45.398697    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:05:45.398727    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:05:45.398741    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:05:45.398794    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:05:45.398807    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:05:47.398849    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Attempt 2
	I0906 17:05:47.398871    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:47.398951    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | hyperkit pid from json: 4391
	I0906 17:05:47.399789    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Searching for 76:4a:f6:89:c1:bc in /var/db/dhcpd_leases ...
	I0906 17:05:47.399852    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0906 17:05:47.399864    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:05:47.399879    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:05:47.399891    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:05:47.399903    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:05:47.399936    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:05:47.399961    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:05:47.399973    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:05:47.399983    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:05:47.399993    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:05:47.400004    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:05:47.400014    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:05:47.400022    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:05:47.400032    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:05:47.400040    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:05:47.400050    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:05:47.400058    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:05:47.400065    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:05:49.026301    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:49 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0906 17:05:49.026345    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:49 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0906 17:05:49.026359    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | 2023/09/06 17:05:49 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0906 17:05:49.400029    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Attempt 3
	I0906 17:05:49.400043    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:49.400149    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | hyperkit pid from json: 4391
	I0906 17:05:49.400974    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Searching for 76:4a:f6:89:c1:bc in /var/db/dhcpd_leases ...
	I0906 17:05:49.401031    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0906 17:05:49.401045    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:05:49.401079    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:05:49.401096    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:05:49.401108    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:05:49.401121    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:05:49.401130    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:05:49.401139    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:05:49.401147    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:05:49.401154    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:05:49.401191    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:05:49.401209    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:05:49.401220    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:05:49.401231    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:05:49.401256    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:05:49.401273    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:05:49.401281    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:05:49.401291    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:05:51.403029    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Attempt 4
	I0906 17:05:51.403047    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:51.403165    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | hyperkit pid from json: 4391
	I0906 17:05:51.403963    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Searching for 76:4a:f6:89:c1:bc in /var/db/dhcpd_leases ...
	I0906 17:05:51.404035    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Found 17 entries in /var/db/dhcpd_leases!
	I0906 17:05:51.404067    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:05:51.404081    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:05:51.404089    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:05:51.404097    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:05:51.404104    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:05:51.404112    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:05:51.404122    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:05:51.404139    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:05:51.404148    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:05:51.404156    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:05:51.404165    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:05:51.404181    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:05:51.404190    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:05:51.404208    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:05:51.404221    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:05:51.404235    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:05:51.404245    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:05:53.405271    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Attempt 5
	I0906 17:05:53.405291    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:53.405382    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | hyperkit pid from json: 4391
	I0906 17:05:53.406295    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Searching for 76:4a:f6:89:c1:bc in /var/db/dhcpd_leases ...
	I0906 17:05:53.406362    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Found 18 entries in /var/db/dhcpd_leases!
	I0906 17:05:53.406399    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:05:53.406413    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | Found match: 76:4a:f6:89:c1:bc
	I0906 17:05:53.406420    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | IP: 192.168.64.19
	I0906 17:05:53.406465    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetConfigRaw
	I0906 17:05:53.407206    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:53.407312    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:53.407453    4382 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0906 17:05:53.407483    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetState
	I0906 17:05:53.407654    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:05:53.407782    4382 main.go:141] libmachine: (force-systemd-env-420000) DBG | hyperkit pid from json: 4391
	I0906 17:05:53.408713    4382 main.go:141] libmachine: Detecting operating system of created instance...
	I0906 17:05:53.408727    4382 main.go:141] libmachine: Waiting for SSH to be available...
	I0906 17:05:53.408734    4382 main.go:141] libmachine: Getting to WaitForSSH function...
	I0906 17:05:53.408740    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:53.408837    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:53.408998    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.409113    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.409241    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:53.409439    4382 main.go:141] libmachine: Using SSH client type: native
	I0906 17:05:53.409869    4382 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.19 22 <nil> <nil>}
	I0906 17:05:53.409880    4382 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0906 17:05:53.483297    4382 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0906 17:05:53.483332    4382 main.go:141] libmachine: Detecting the provisioner...
	I0906 17:05:53.483339    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:53.483489    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:53.483654    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.483797    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.483963    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:53.484235    4382 main.go:141] libmachine: Using SSH client type: native
	I0906 17:05:53.484642    4382 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.19 22 <nil> <nil>}
	I0906 17:05:53.484666    4382 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0906 17:05:53.557095    4382 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2021.02.12-1-g88b5c50-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"
	
	I0906 17:05:53.557187    4382 main.go:141] libmachine: found compatible host: buildroot
	I0906 17:05:53.557208    4382 main.go:141] libmachine: Provisioning with buildroot...
	I0906 17:05:53.557238    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetMachineName
	I0906 17:05:53.557377    4382 buildroot.go:166] provisioning hostname "force-systemd-env-420000"
	I0906 17:05:53.557392    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetMachineName
	I0906 17:05:53.557534    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:53.557675    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:53.557850    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.557981    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.558075    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:53.558211    4382 main.go:141] libmachine: Using SSH client type: native
	I0906 17:05:53.558525    4382 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.19 22 <nil> <nil>}
	I0906 17:05:53.558535    4382 main.go:141] libmachine: About to run SSH command:
	sudo hostname force-systemd-env-420000 && echo "force-systemd-env-420000" | sudo tee /etc/hostname
	I0906 17:05:53.635591    4382 main.go:141] libmachine: SSH cmd err, output: <nil>: force-systemd-env-420000
	
	I0906 17:05:53.635647    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:53.635785    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:53.635869    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.635955    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.636030    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:53.636210    4382 main.go:141] libmachine: Using SSH client type: native
	I0906 17:05:53.636571    4382 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.19 22 <nil> <nil>}
	I0906 17:05:53.636584    4382 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sforce-systemd-env-420000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 force-systemd-env-420000/g' /etc/hosts;
				else 
					echo '127.0.1.1 force-systemd-env-420000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0906 17:05:53.710009    4382 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0906 17:05:53.710030    4382 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/17174-977/.minikube CaCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/17174-977/.minikube}
	I0906 17:05:53.710060    4382 buildroot.go:174] setting up certificates
	I0906 17:05:53.710093    4382 provision.go:83] configureAuth start
	I0906 17:05:53.710102    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetMachineName
	I0906 17:05:53.710295    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetIP
	I0906 17:05:53.710418    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:53.710515    4382 provision.go:138] copyHostCerts
	I0906 17:05:53.710555    4382 vm_assets.go:163] NewFileAsset: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem -> /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem
	I0906 17:05:53.710606    4382 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem, removing ...
	I0906 17:05:53.710616    4382 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem
	I0906 17:05:53.710790    4382 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem (1078 bytes)
	I0906 17:05:53.710986    4382 vm_assets.go:163] NewFileAsset: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem -> /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem
	I0906 17:05:53.711018    4382 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem, removing ...
	I0906 17:05:53.711023    4382 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem
	I0906 17:05:53.711138    4382 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem (1123 bytes)
	I0906 17:05:53.711289    4382 vm_assets.go:163] NewFileAsset: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/key.pem -> /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem
	I0906 17:05:53.711323    4382 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem, removing ...
	I0906 17:05:53.711327    4382 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem
	I0906 17:05:53.711405    4382 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem (1679 bytes)
	I0906 17:05:53.711542    4382 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca-key.pem org=jenkins.force-systemd-env-420000 san=[192.168.64.19 192.168.64.19 localhost 127.0.0.1 minikube force-systemd-env-420000]
	I0906 17:05:53.875631    4382 provision.go:172] copyRemoteCerts
	I0906 17:05:53.875699    4382 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0906 17:05:53.875719    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:53.875876    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:53.875964    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.876051    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:53.876152    4382 sshutil.go:53] new ssh client: &{IP:192.168.64.19 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/id_rsa Username:docker}
	I0906 17:05:53.917403    4382 vm_assets.go:163] NewFileAsset: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0906 17:05:53.917478    4382 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0906 17:05:53.933370    4382 vm_assets.go:163] NewFileAsset: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0906 17:05:53.933439    4382 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem --> /etc/docker/server.pem (1249 bytes)
	I0906 17:05:53.949607    4382 vm_assets.go:163] NewFileAsset: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0906 17:05:53.949682    4382 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0906 17:05:53.966142    4382 provision.go:86] duration metric: configureAuth took 256.021096ms
	I0906 17:05:53.966175    4382 buildroot.go:189] setting minikube options for container-runtime
	I0906 17:05:53.966340    4382 config.go:182] Loaded profile config "force-systemd-env-420000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 17:05:53.966354    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:53.966540    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:53.966634    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:53.966712    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.966815    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:53.966905    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:53.967017    4382 main.go:141] libmachine: Using SSH client type: native
	I0906 17:05:53.967347    4382 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.19 22 <nil> <nil>}
	I0906 17:05:53.967375    4382 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0906 17:05:54.035238    4382 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0906 17:05:54.035250    4382 buildroot.go:70] root file system type: tmpfs
	I0906 17:05:54.035339    4382 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0906 17:05:54.035355    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:54.035479    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:54.035572    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.035667    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.035785    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:54.035919    4382 main.go:141] libmachine: Using SSH client type: native
	I0906 17:05:54.036229    4382 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.19 22 <nil> <nil>}
	I0906 17:05:54.036278    4382 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0906 17:05:54.111511    4382 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0906 17:05:54.111560    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:54.111694    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:54.111791    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.111891    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.111974    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:54.112114    4382 main.go:141] libmachine: Using SSH client type: native
	I0906 17:05:54.112423    4382 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.19 22 <nil> <nil>}
	I0906 17:05:54.112435    4382 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0906 17:05:54.627696    4382 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0906 17:05:54.627712    4382 main.go:141] libmachine: Checking connection to Docker...
	I0906 17:05:54.627733    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetURL
	I0906 17:05:54.627901    4382 main.go:141] libmachine: Docker is up and running!
	I0906 17:05:54.627910    4382 main.go:141] libmachine: Reticulating splines...
	I0906 17:05:54.627915    4382 client.go:171] LocalClient.Create took 11.94033482s
	I0906 17:05:54.627964    4382 start.go:167] duration metric: libmachine.API.Create for "force-systemd-env-420000" took 11.940398731s
	I0906 17:05:54.627973    4382 start.go:300] post-start starting for "force-systemd-env-420000" (driver="hyperkit")
	I0906 17:05:54.627983    4382 start.go:329] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0906 17:05:54.627993    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:54.628193    4382 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0906 17:05:54.628208    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:54.628321    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:54.628416    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.628573    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:54.628718    4382 sshutil.go:53] new ssh client: &{IP:192.168.64.19 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/id_rsa Username:docker}
	I0906 17:05:54.667652    4382 ssh_runner.go:195] Run: cat /etc/os-release
	I0906 17:05:54.670415    4382 info.go:137] Remote host: Buildroot 2021.02.12
	I0906 17:05:54.670426    4382 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17174-977/.minikube/addons for local assets ...
	I0906 17:05:54.670516    4382 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17174-977/.minikube/files for local assets ...
	I0906 17:05:54.670679    4382 filesync.go:149] local asset: /Users/jenkins/minikube-integration/17174-977/.minikube/files/etc/ssl/certs/14382.pem -> 14382.pem in /etc/ssl/certs
	I0906 17:05:54.670686    4382 vm_assets.go:163] NewFileAsset: /Users/jenkins/minikube-integration/17174-977/.minikube/files/etc/ssl/certs/14382.pem -> /etc/ssl/certs/14382.pem
	I0906 17:05:54.670874    4382 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0906 17:05:54.676657    4382 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/files/etc/ssl/certs/14382.pem --> /etc/ssl/certs/14382.pem (1708 bytes)
	I0906 17:05:54.693611    4382 start.go:303] post-start completed in 65.627983ms
	I0906 17:05:54.693652    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetConfigRaw
	I0906 17:05:54.694418    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetIP
	I0906 17:05:54.694588    4382 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/force-systemd-env-420000/config.json ...
	I0906 17:05:54.694882    4382 start.go:128] duration metric: createHost completed in 12.037454466s
	I0906 17:05:54.694899    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:54.694989    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:54.695078    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.695213    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.695328    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:54.695492    4382 main.go:141] libmachine: Using SSH client type: native
	I0906 17:05:54.695784    4382 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.19 22 <nil> <nil>}
	I0906 17:05:54.695792    4382 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0906 17:05:54.763441    4382 main.go:141] libmachine: SSH cmd err, output: <nil>: 1694045154.867929794
	
	I0906 17:05:54.763452    4382 fix.go:206] guest clock: 1694045154.867929794
	I0906 17:05:54.763458    4382 fix.go:219] Guest: 2023-09-06 17:05:54.867929794 -0700 PDT Remote: 2023-09-06 17:05:54.694893 -0700 PDT m=+12.540005122 (delta=173.036794ms)
	I0906 17:05:54.763480    4382 fix.go:190] guest clock delta is within tolerance: 173.036794ms
	I0906 17:05:54.763484    4382 start.go:83] releasing machines lock for "force-systemd-env-420000", held for 12.106121984s
	I0906 17:05:54.763503    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:54.763628    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetIP
	I0906 17:05:54.763719    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:54.763989    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:54.764094    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .DriverName
	I0906 17:05:54.764177    4382 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0906 17:05:54.764206    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:54.764218    4382 ssh_runner.go:195] Run: cat /version.json
	I0906 17:05:54.764236    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHHostname
	I0906 17:05:54.764327    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:54.764333    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHPort
	I0906 17:05:54.764414    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.764442    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHKeyPath
	I0906 17:05:54.764497    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:54.764522    4382 main.go:141] libmachine: (force-systemd-env-420000) Calling .GetSSHUsername
	I0906 17:05:54.764585    4382 sshutil.go:53] new ssh client: &{IP:192.168.64.19 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/id_rsa Username:docker}
	I0906 17:05:54.764596    4382 sshutil.go:53] new ssh client: &{IP:192.168.64.19 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/force-systemd-env-420000/id_rsa Username:docker}
	I0906 17:05:54.844988    4382 ssh_runner.go:195] Run: systemctl --version
	I0906 17:05:54.848766    4382 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0906 17:05:54.852189    4382 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0906 17:05:54.852238    4382 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0906 17:05:54.863199    4382 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0906 17:05:54.863212    4382 start.go:466] detecting cgroup driver to use...
	I0906 17:05:54.863222    4382 start.go:470] using "systemd" cgroup driver as enforced via flags
	I0906 17:05:54.863328    4382 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 17:05:54.876593    4382 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0906 17:05:54.883756    4382 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0906 17:05:54.890803    4382 containerd.go:145] configuring containerd to use "systemd" as cgroup driver...
	I0906 17:05:54.890859    4382 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = true|g' /etc/containerd/config.toml"
	I0906 17:05:54.897936    4382 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0906 17:05:54.905226    4382 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0906 17:05:54.912822    4382 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0906 17:05:54.921222    4382 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0906 17:05:54.929015    4382 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0906 17:05:54.936828    4382 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0906 17:05:54.944414    4382 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0906 17:05:54.952088    4382 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:05:55.040323    4382 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0906 17:05:55.053558    4382 start.go:466] detecting cgroup driver to use...
	I0906 17:05:55.053575    4382 start.go:470] using "systemd" cgroup driver as enforced via flags
	I0906 17:05:55.053646    4382 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0906 17:05:55.065601    4382 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 17:05:55.076339    4382 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0906 17:05:55.093295    4382 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 17:05:55.102084    4382 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 17:05:55.111311    4382 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0906 17:05:55.135941    4382 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 17:05:55.145001    4382 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 17:05:55.157228    4382 ssh_runner.go:195] Run: which cri-dockerd
	I0906 17:05:55.159717    4382 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0906 17:05:55.166128    4382 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0906 17:05:55.177914    4382 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0906 17:05:55.266365    4382 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0906 17:05:55.358425    4382 docker.go:535] configuring docker to use "systemd" as cgroup driver...
	I0906 17:05:55.358441    4382 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (143 bytes)
	I0906 17:05:55.370345    4382 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:05:55.456905    4382 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0906 17:05:56.715497    4382 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.258588365s)
	I0906 17:05:56.715612    4382 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0906 17:05:56.799999    4382 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0906 17:05:56.893925    4382 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0906 17:05:56.992440    4382 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:05:57.079926    4382 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0906 17:05:57.141137    4382 out.go:177] 
	W0906 17:05:57.163256    4382 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	W0906 17:05:57.163287    4382 out.go:239] * 
	* 
	W0906 17:05:57.164566    4382 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0906 17:05:57.228188    4382 out.go:177] 

                                                
                                                
** /stderr **
docker_test.go:157: failed to start minikube with args: "out/minikube-darwin-amd64 start -p force-systemd-env-420000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit " : exit status 90
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-420000 ssh "docker info --format {{.CgroupDriver}}"
docker_test.go:166: *** TestForceSystemdEnv FAILED at 2023-09-06 17:05:57.420924 -0700 PDT m=+1742.431512189
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-env-420000 -n force-systemd-env-420000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p force-systemd-env-420000 -n force-systemd-env-420000: exit status 6 (134.426732ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0906 17:05:57.544933    4398 status.go:415] kubeconfig endpoint: extract IP: "force-systemd-env-420000" does not appear in /Users/jenkins/minikube-integration/17174-977/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "force-systemd-env-420000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
helpers_test.go:175: Cleaning up "force-systemd-env-420000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-420000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-420000: (5.298637634s)
--- FAIL: TestForceSystemdEnv (20.71s)

                                                
                                    
TestFunctional/parallel/MountCmd/any-port (6.28s)

                                                
                                                
=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1694043903718492000" to /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1694043903718492000" to /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1694043903718492000" to /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001/test-1694043903718492000
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (133.984781ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
E0906 16:45:03.895969    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Sep  6 23:45 created-by-test
-rw-r--r-- 1 docker docker 24 Sep  6 23:45 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Sep  6 23:45 test-1694043903718492000
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh cat /mount-9p/test-1694043903718492000
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-283000 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:344: "busybox-mount" [67337a01-a450-4211-a3f9-3b0e8db3bfde] Pending
helpers_test.go:344: "busybox-mount" [67337a01-a450-4211-a3f9-3b0e8db3bfde] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:344: "busybox-mount" [67337a01-a450-4211-a3f9-3b0e8db3bfde] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "busybox-mount" [67337a01-a450-4211-a3f9-3b0e8db3bfde] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 4.010536508s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-283000 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:181: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh stat /mount-9p/created-by-pod: exit status 38 (205.321901ms)

                                                
                                                
-- stdout --
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to HOST_CONFIG_LOAD: Error getting cluster config: unmarshal: unexpected end of JSON input
	* 
	╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                        │
	│    * If the above advice does not help, please let us know:                                                            │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                          │
	│                                                                                                                        │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                               │
	│    * Please also attach the following file to the GitHub issue:                                                        │
	│    * - /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube_ssh_8c30ac2c402e091c2ad857e651d7deb00f5fb573_0.log    │
	│                                                                                                                        │
	╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
functional_test_mount_test.go:183: failed to stat the file "/mount-9p/created-by-pod" inside minikube : args "out/minikube-darwin-amd64 -p functional-283000 ssh stat /mount-9p/created-by-pod": exit status 38
functional_test_mount_test.go:80: "TestFunctional/parallel/MountCmd/any-port" failed, getting debug info...
functional_test_mount_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates"
functional_test_mount_test.go:85: (debug) out/minikube-darwin-amd64 -p functional-283000 ssh "mount | grep 9p; ls -la /mount-9p; cat /mount-9p/pod-dates":
192.168.64.1 on /mount-9p type 9p (rw,relatime,sync,dirsync,dfltuid=1000,dfltgid=1000,access=any,msize=65536,trans=tcp,noextend,port=50360)
total 2
-rw-r--r-- 1 docker docker  5 Sep  6 23:45 created-by-pod
-rw-r--r-- 1 docker docker 24 Sep  6 23:45 created-by-test
-rw-r--r-- 1 docker docker 10 Sep  6 23:45 pod-dates
-rw-r--r-- 1 docker docker 24 Sep  6 23:45 test-1694043903718492000
test date
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001:/mount-9p --alsologtostderr -v=1] ...
functional_test_mount_test.go:94: (dbg) [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001:/mount-9p --alsologtostderr -v=1] stdout:
* Mounting host path /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001 into VM as /mount-9p ...
- Mount type:   9p
- User ID:      docker
- Group ID:     docker
- Version:      9p2000.L
- Message Size: 262144
- Options:      map[]
- Bind Address: 192.168.64.1:50360
* Userspace file server: ufs starting
* Successfully mounted /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001 to /mount-9p

                                                
                                                
* NOTE: This process must stay alive for the mount to be accessible ...
* Unmounting /mount-9p ...

                                                
                                                
functional_test_mount_test.go:94: (dbg) [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001:/mount-9p --alsologtostderr -v=1] stderr:
I0906 16:45:03.753976    2445 out.go:296] Setting OutFile to fd 1 ...
I0906 16:45:03.754236    2445 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:03.754242    2445 out.go:309] Setting ErrFile to fd 2...
I0906 16:45:03.754246    2445 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:03.754427    2445 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
I0906 16:45:03.754782    2445 mustload.go:65] Loading cluster: functional-283000
I0906 16:45:03.755041    2445 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:03.755405    2445 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:03.755453    2445 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:03.762254    2445 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50354
I0906 16:45:03.762635    2445 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:03.763062    2445 main.go:141] libmachine: Using API Version  1
I0906 16:45:03.763074    2445 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:03.763295    2445 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:03.763408    2445 main.go:141] libmachine: (functional-283000) Calling .GetState
I0906 16:45:03.763504    2445 main.go:141] libmachine: (functional-283000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0906 16:45:03.763571    2445 main.go:141] libmachine: (functional-283000) DBG | hyperkit pid from json: 1847
I0906 16:45:03.764475    2445 host.go:66] Checking if "functional-283000" exists ...
I0906 16:45:03.764720    2445 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:03.764743    2445 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:03.771750    2445 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50358
I0906 16:45:03.772068    2445 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:03.772394    2445 main.go:141] libmachine: Using API Version  1
I0906 16:45:03.772404    2445 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:03.772583    2445 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:03.772686    2445 main.go:141] libmachine: (functional-283000) Calling .DriverName
I0906 16:45:03.772774    2445 main.go:141] libmachine: (functional-283000) Calling .DriverName
I0906 16:45:03.772852    2445 main.go:141] libmachine: (functional-283000) Calling .GetIP
I0906 16:45:03.773605    2445 main.go:141] libmachine: (functional-283000) Calling .DriverName
I0906 16:45:03.795079    2445 out.go:177] * Mounting host path /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001 into VM as /mount-9p ...
I0906 16:45:03.815937    2445 out.go:177]   - Mount type:   9p
I0906 16:45:03.874008    2445 out.go:177]   - User ID:      docker
I0906 16:45:03.895043    2445 out.go:177]   - Group ID:     docker
I0906 16:45:03.937058    2445 out.go:177]   - Version:      9p2000.L
I0906 16:45:03.958033    2445 out.go:177]   - Message Size: 262144
I0906 16:45:03.978937    2445 out.go:177]   - Options:      map[]
I0906 16:45:04.000091    2445 out.go:177]   - Bind Address: 192.168.64.1:50360
I0906 16:45:04.021186    2445 out.go:177] * Userspace file server: 
I0906 16:45:04.021490    2445 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f /mount-9p || echo "
I0906 16:45:04.042190    2445 main.go:141] libmachine: (functional-283000) Calling .GetSSHHostname
I0906 16:45:04.042491    2445 main.go:141] libmachine: (functional-283000) Calling .GetSSHPort
I0906 16:45:04.042721    2445 main.go:141] libmachine: (functional-283000) Calling .GetSSHKeyPath
I0906 16:45:04.042927    2445 main.go:141] libmachine: (functional-283000) Calling .GetSSHUsername
I0906 16:45:04.043131    2445 sshutil.go:53] new ssh client: &{IP:192.168.64.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/functional-283000/id_rsa Username:docker}
I0906 16:45:04.085626    2445 mount.go:180] unmount for /mount-9p ran successfully
I0906 16:45:04.085640    2445 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /mount-9p"
I0906 16:45:04.092739    2445 ssh_runner.go:195] Run: /bin/bash -c "sudo mount -t 9p -o dfltgid=$(grep ^docker: /etc/group | cut -d: -f3),dfltuid=$(id -u docker),msize=262144,port=50360,trans=tcp,version=9p2000.L 192.168.64.1 /mount-9p"
I0906 16:45:04.109069    2445 main.go:125] stdlog: ufs.go:141 connected
I0906 16:45:04.110410    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tversion tag 65535 msize 65536 version '9P2000.L'
I0906 16:45:04.110443    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rversion tag 65535 msize 65536 version '9P2000'
I0906 16:45:04.110665    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tattach tag 0 fid 0 afid 4294967295 uname 'nobody' nuname 0 aname ''
I0906 16:45:04.110738    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rattach tag 0 aqid (78195cb 6ce296e6 'd')
I0906 16:45:04.110993    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:04.111994    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce296e6 'd') m d755 at 0 mt 1694043903 l 160 t 0 d 0 ext )
I0906 16:45:04.115014    2445 lock.go:50] WriteFile acquiring /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/.mount-process: {Name:mk267c5403aeb5d0d84b77b4617049b83a68375f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I0906 16:45:04.115300    2445 mount.go:105] mount successful: ""
I0906 16:45:04.136540    2445 out.go:177] * Successfully mounted /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port2736399982/001 to /mount-9p
I0906 16:45:04.158416    2445 out.go:177] 
I0906 16:45:04.179242    2445 out.go:177] * NOTE: This process must stay alive for the mount to be accessible ...
I0906 16:45:04.446639    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:04.447166    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce296e6 'd') m d755 at 0 mt 1694043903 l 160 t 0 d 0 ext )
I0906 16:45:04.580183    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:04.580689    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce296e6 'd') m d755 at 0 mt 1694043903 l 160 t 0 d 0 ext )
I0906 16:45:04.581689    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 
I0906 16:45:04.581729    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:04.581885    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Topen tag 0 fid 1 mode 0
I0906 16:45:04.581942    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Ropen tag 0 qid (78195cb 6ce296e6 'd') iounit 0
I0906 16:45:04.582118    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:04.582548    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce296e6 'd') m d755 at 0 mt 1694043903 l 160 t 0 d 0 ext )
I0906 16:45:04.582923    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 1 offset 0 count 65512
I0906 16:45:04.584025    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 243
I0906 16:45:04.584193    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 1 offset 243 count 65269
I0906 16:45:04.584230    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:04.584381    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 1 offset 243 count 65512
I0906 16:45:04.584416    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:04.584594    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'test-1694043903718492000' 
I0906 16:45:04.584637    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195ce 6ce296e6 '') 
I0906 16:45:04.584787    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:04.585156    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('test-1694043903718492000' 'jenkins' '20' '' q (78195ce 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:04.585319    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:04.585682    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('test-1694043903718492000' 'jenkins' '20' '' q (78195ce 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:04.585839    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:04.585864    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:04.586019    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'created-by-test-removed-by-pod' 
I0906 16:45:04.586058    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195cd 6ce296e6 '') 
I0906 16:45:04.586198    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:04.586574    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' '20' '' q (78195cd 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:04.586742    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:04.587089    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' '20' '' q (78195cd 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:04.587256    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:04.587270    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:04.587442    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I0906 16:45:04.587485    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195cc 6ce296e6 '') 
I0906 16:45:04.587641    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:04.588010    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test' 'jenkins' '20' '' q (78195cc 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:04.588181    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:04.588552    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test' 'jenkins' '20' '' q (78195cc 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:04.588725    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:04.588744    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:04.588904    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 1 offset 243 count 65512
I0906 16:45:04.588933    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:04.589087    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:04.589103    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:04.723573    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'test-1694043903718492000' 
I0906 16:45:04.723639    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195ce 6ce296e6 '') 
I0906 16:45:04.723767    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:04.724215    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('test-1694043903718492000' 'jenkins' '20' '' q (78195ce 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:04.724428    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 1 newfid 2 
I0906 16:45:04.724459    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:04.724591    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Topen tag 0 fid 2 mode 0
I0906 16:45:04.724649    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Ropen tag 0 qid (78195ce 6ce296e6 '') iounit 0
I0906 16:45:04.724789    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:04.725190    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('test-1694043903718492000' 'jenkins' '20' '' q (78195ce 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:04.725427    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 0 count 65512
I0906 16:45:04.725466    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 24
I0906 16:45:04.725578    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 24 count 65512
I0906 16:45:04.725608    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:04.725751    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 24 count 65512
I0906 16:45:04.725784    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:04.725955    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:04.725978    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:04.726086    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:04.726100    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.736179    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:06.737008    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce296e6 'd') m d755 at 0 mt 1694043903 l 160 t 0 d 0 ext )
I0906 16:45:06.803495    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:06.804240    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce296e6 'd') m d755 at 0 mt 1694043903 l 160 t 0 d 0 ext )
I0906 16:45:06.806741    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:06.807180    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce296e6 'd') m d755 at 0 mt 1694043903 l 160 t 0 d 0 ext )
I0906 16:45:06.829149    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'created-by-test' 
I0906 16:45:06.829227    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195cc 6ce296e6 '') 
I0906 16:45:06.829936    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:06.830570    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test' 'jenkins' '20' '' q (78195cc 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:06.830811    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 1 newfid 2 
I0906 16:45:06.830897    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:06.831090    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Topen tag 0 fid 2 mode 0
I0906 16:45:06.831156    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Ropen tag 0 qid (78195cc 6ce296e6 '') iounit 0
I0906 16:45:06.831330    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 0 count 65512
I0906 16:45:06.831369    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 24
I0906 16:45:06.831534    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 24 count 65512
I0906 16:45:06.831590    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:06.831807    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 24 count 65512
I0906 16:45:06.831846    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:06.832002    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 24 count 65512
I0906 16:45:06.832041    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:06.832200    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:06.832222    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.832385    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:06.832401    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.832858    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'created-by-pod' 
I0906 16:45:06.832906    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rerror tag 0 ename 'file not found' ecode 0
I0906 16:45:06.833093    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 
I0906 16:45:06.833131    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:06.833336    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tcreate tag 0 fid 1 name 'created-by-pod' perm 644 mode 1 
I0906 16:45:06.833475    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rcreate tag 0 qid (78195d8 6ce2a311 '') iounit 0
I0906 16:45:06.833653    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'created-by-pod' 
I0906 16:45:06.833698    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195d8 6ce2a311 '') 
I0906 16:45:06.833876    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:06.834443    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-pod' 'jenkins' '20' '' q (78195d8 6ce2a311 '') m 644 at 0 mt 1694043906 l 0 t 0 d 0 ext )
I0906 16:45:06.834724    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twrite tag 0 fid 1 offset 0 count 5
I0906 16:45:06.834805    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwrite tag 0 count 5
I0906 16:45:06.835171    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:06.835254    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.835423    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:06.835437    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.836703    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'created-by-test-removed-by-pod' 
I0906 16:45:06.836755    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195cd 6ce296e6 '') 
I0906 16:45:06.836918    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:06.837405    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' '20' '' q (78195cd 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:06.837659    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:06.838071    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' '20' '' q (78195cd 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:06.838255    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:06.838271    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.838469    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'created-by-test-removed-by-pod' 
I0906 16:45:06.838516    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195cd 6ce296e6 '') 
I0906 16:45:06.838677    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:06.839105    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' '20' '' q (78195cd 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:06.839276    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:06.839291    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.839443    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'created-by-test-removed-by-pod' 
I0906 16:45:06.839488    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195cd 6ce296e6 '') 
I0906 16:45:06.839870    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:06.840457    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test-removed-by-pod' 'jenkins' '20' '' q (78195cd 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:06.840731    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 1 newfid 2 
I0906 16:45:06.840772    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:06.840943    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tremove tag 0 fid 2
I0906 16:45:06.841076    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rremove tag 0
I0906 16:45:06.841244    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:06.841263    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.841637    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'created-by-pod-removed-by-test' 
I0906 16:45:06.841679    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rerror tag 0 ename 'file not found' ecode 0
I0906 16:45:06.841820    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 
I0906 16:45:06.841848    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:06.841996    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tcreate tag 0 fid 1 name 'created-by-pod-removed-by-test' perm 644 mode 1 
I0906 16:45:06.842150    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rcreate tag 0 qid (78195d9 6ce2a31a '') iounit 0
I0906 16:45:06.842326    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'created-by-pod-removed-by-test' 
I0906 16:45:06.842369    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195d9 6ce2a31a '') 
I0906 16:45:06.842591    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:06.843004    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-pod-removed-by-test' 'jenkins' '20' '' q (78195d9 6ce2a31a '') m 644 at 0 mt 1694043906 l 0 t 0 d 0 ext )
I0906 16:45:06.843231    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 3 0:'pod-dates' 
I0906 16:45:06.843275    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rerror tag 0 ename 'file not found' ecode 0
I0906 16:45:06.843475    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 3 
I0906 16:45:06.843506    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:06.843684    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tcreate tag 0 fid 3 name 'pod-dates' perm 644 mode 1 
I0906 16:45:06.843804    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rcreate tag 0 qid (78195da 6ce2a31b '') iounit 0
I0906 16:45:06.843948    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 4 0:'pod-dates' 
I0906 16:45:06.844015    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195da 6ce2a31b '') 
I0906 16:45:06.844159    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 4
I0906 16:45:06.844549    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('pod-dates' 'jenkins' '20' '' q (78195da 6ce2a31b '') m 644 at 0 mt 1694043906 l 0 t 0 d 0 ext )
I0906 16:45:06.844715    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:06.844732    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.844875    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:06.844889    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.845066    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twrite tag 0 fid 3 offset 0 count 10
I0906 16:45:06.845131    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwrite tag 0 count 10
I0906 16:45:06.845313    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 3
I0906 16:45:06.845369    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:06.845582    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 4
I0906 16:45:06.845597    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.074221    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'created-by-test' 
I0906 16:45:09.074461    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195cc 6ce296e6 '') 
I0906 16:45:09.074822    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:09.075537    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test' 'jenkins' '20' '' q (78195cc 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:09.076068    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:09.076591    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test' 'jenkins' '20' '' q (78195cc 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:09.076922    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:09.076958    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.436370    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:09.436869    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce2acbd 'd') m d755 at 0 mt 1694043909 l 192 t 0 d 0 ext )
I0906 16:45:09.438245    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 
I0906 16:45:09.438303    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:09.438538    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Topen tag 0 fid 1 mode 0
I0906 16:45:09.438625    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Ropen tag 0 qid (78195cb 6ce2acbd 'd') iounit 0
I0906 16:45:09.438796    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 0
I0906 16:45:09.439192    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('001' 'jenkins' '20' '' q (78195cb 6ce2acbd 'd') m d755 at 0 mt 1694043909 l 192 t 0 d 0 ext )
I0906 16:45:09.439437    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 1 offset 0 count 65512
I0906 16:45:09.440816    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 294
I0906 16:45:09.440950    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 1 offset 294 count 65218
I0906 16:45:09.440987    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:09.441111    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 1 offset 294 count 65512
I0906 16:45:09.441143    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:09.441262    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'created-by-pod' 
I0906 16:45:09.441307    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195d8 6ce2a312 '') 
I0906 16:45:09.441431    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:09.441819    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-pod' 'jenkins' '20' '' q (78195d8 6ce2a312 '') m 644 at 0 mt 1694043906 l 5 t 0 d 0 ext )
I0906 16:45:09.441954    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:09.442296    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-pod' 'jenkins' '20' '' q (78195d8 6ce2a312 '') m 644 at 0 mt 1694043906 l 5 t 0 d 0 ext )
I0906 16:45:09.442416    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:09.442432    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.442556    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'pod-dates' 
I0906 16:45:09.442596    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195da 6ce2a31d '') 
I0906 16:45:09.442715    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:09.443090    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('pod-dates' 'jenkins' '20' '' q (78195da 6ce2a31d '') m 644 at 0 mt 1694043906 l 10 t 0 d 0 ext )
I0906 16:45:09.443217    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:09.443565    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('pod-dates' 'jenkins' '20' '' q (78195da 6ce2a31d '') m 644 at 0 mt 1694043906 l 10 t 0 d 0 ext )
I0906 16:45:09.443716    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:09.443732    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.443853    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'test-1694043903718492000' 
I0906 16:45:09.443899    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195ce 6ce296e6 '') 
I0906 16:45:09.444000    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:09.444373    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('test-1694043903718492000' 'jenkins' '20' '' q (78195ce 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:09.444519    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:09.444879    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('test-1694043903718492000' 'jenkins' '20' '' q (78195ce 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:09.444998    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:09.445011    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.445140    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 2 0:'created-by-test' 
I0906 16:45:09.445179    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195cc 6ce296e6 '') 
I0906 16:45:09.445294    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:09.445648    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test' 'jenkins' '20' '' q (78195cc 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:09.445784    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 2
I0906 16:45:09.446150    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('created-by-test' 'jenkins' '20' '' q (78195cc 6ce296e6 '') m 644 at 0 mt 1694043903 l 24 t 0 d 0 ext )
I0906 16:45:09.446261    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:09.446274    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.446370    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 1 offset 294 count 65512
I0906 16:45:09.446397    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:09.446494    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:09.446511    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.448001    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 0 newfid 1 0:'pod-dates' 
I0906 16:45:09.448052    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 (78195da 6ce2a31d '') 
I0906 16:45:09.448151    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:09.448562    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('pod-dates' 'jenkins' '20' '' q (78195da 6ce2a31d '') m 644 at 0 mt 1694043906 l 10 t 0 d 0 ext )
I0906 16:45:09.460184    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Twalk tag 0 fid 1 newfid 2 
I0906 16:45:09.460318    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rwalk tag 0 
I0906 16:45:09.462114    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Topen tag 0 fid 2 mode 0
I0906 16:45:09.462265    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Ropen tag 0 qid (78195da 6ce2a31d '') iounit 0
I0906 16:45:09.464467    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tstat tag 0 fid 1
I0906 16:45:09.465192    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rstat tag 0 st ('pod-dates' 'jenkins' '20' '' q (78195da 6ce2a31d '') m 644 at 0 mt 1694043906 l 10 t 0 d 0 ext )
I0906 16:45:09.465492    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 0 count 65512
I0906 16:45:09.465537    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 10
I0906 16:45:09.465819    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 10 count 65512
I0906 16:45:09.465877    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:09.466321    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tread tag 0 fid 2 offset 10 count 65512
I0906 16:45:09.466357    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rread tag 0 count 0
I0906 16:45:09.466734    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 2
I0906 16:45:09.466757    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.466936    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 1
I0906 16:45:09.466954    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.618095    2445 main.go:125] stdlog: srv_conn.go:133 >>> 192.168.64.4:45508 Tclunk tag 0 fid 0
I0906 16:45:09.618150    2445 main.go:125] stdlog: srv_conn.go:190 <<< 192.168.64.4:45508 Rclunk tag 0
I0906 16:45:09.622006    2445 main.go:125] stdlog: ufs.go:147 disconnected
I0906 16:45:09.886793    2445 out.go:177] * Unmounting /mount-9p ...
I0906 16:45:09.924559    2445 ssh_runner.go:195] Run: /bin/bash -c "[ "x$(findmnt -T /mount-9p | grep /mount-9p)" != "x" ] && sudo umount -f /mount-9p || echo "
I0906 16:45:09.932549    2445 mount.go:180] unmount for /mount-9p ran successfully
I0906 16:45:09.932662    2445 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/.mount-process: {Name:mk267c5403aeb5d0d84b77b4617049b83a68375f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I0906 16:45:09.954237    2445 out.go:177] 
--- FAIL: TestFunctional/parallel/MountCmd/any-port (6.28s)

                                                
                                    
TestImageBuild/serial/Setup (18.68s)

                                                
                                                
=== RUN   TestImageBuild/serial/Setup
image_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -p image-973000 --driver=hyperkit 
E0906 16:45:44.857846    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
image_test.go:69: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p image-973000 --driver=hyperkit : exit status 90 (18.550559028s)

                                                
                                                
-- stdout --
	* [image-973000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node image-973000 in cluster image-973000
	* Creating hyperkit VM (CPUs=2, Memory=6000MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
image_test.go:70: failed to start minikube with args: "out/minikube-darwin-amd64 start -p image-973000 --driver=hyperkit " : exit status 90
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p image-973000 -n image-973000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p image-973000 -n image-973000: exit status 6 (128.183248ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0906 16:45:45.993138    2676 status.go:415] kubeconfig endpoint: extract IP: "image-973000" does not appear in /Users/jenkins/minikube-integration/17174-977/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "image-973000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestImageBuild/serial/Setup (18.68s)

                                                
                                    
TestMinikubeProfile (23.86s)

                                                
                                                
=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-691000 --driver=hyperkit 
E0906 16:49:12.013248    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:12.018754    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:12.030895    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:12.053078    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:12.094149    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:12.176340    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:12.338098    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:12.660310    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:13.300809    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:14.581666    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:17.142523    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p first-691000 --driver=hyperkit : exit status 90 (17.962066475s)

                                                
                                                
-- stdout --
	* [first-691000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node first-691000 in cluster first-691000
	* Creating hyperkit VM (CPUs=2, Memory=6000MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
minikube_profile_test.go:46: test pre-condition failed. args "out/minikube-darwin-amd64 start -p first-691000 --driver=hyperkit ": exit status 90
panic.go:522: *** TestMinikubeProfile FAILED at 2023-09-06 16:49:18.162009 -0700 PDT m=+743.142095335
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p second-694000 -n second-694000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p second-694000 -n second-694000: exit status 85 (106.218173ms)

                                                
                                                
-- stdout --
	* Profile "second-694000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p second-694000"

                                                
                                                
-- /stdout --
helpers_test.go:239: status error: exit status 85 (may be ok)
helpers_test.go:241: "second-694000" host is not running, skipping log retrieval (state="* Profile \"second-694000\" not found. Run \"minikube profile list\" to view all profiles.\n  To start a cluster, run: \"minikube start -p second-694000\"")
helpers_test.go:175: Cleaning up "second-694000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-694000
panic.go:522: *** TestMinikubeProfile FAILED at 2023-09-06 16:49:18.626686 -0700 PDT m=+743.606772423
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p first-691000 -n first-691000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p first-691000 -n first-691000: exit status 6 (125.062801ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0906 16:49:18.742418    2964 status.go:415] kubeconfig endpoint: extract IP: "first-691000" does not appear in /Users/jenkins/minikube-integration/17174-977/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "first-691000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
helpers_test.go:175: Cleaning up "first-691000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-691000
E0906 16:49:22.281649    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:49:22.938771    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p first-691000: (5.30983155s)
--- FAIL: TestMinikubeProfile (23.86s)

                                                
                                    
TestPause/serial/Start (16.55s)

                                                
                                                
=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-586000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 
pause_test.go:80: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p pause-586000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : exit status 90 (16.406543555s)

                                                
                                                
-- stdout --
	* [pause-586000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node pause-586000 in cluster pause-586000
	* Creating hyperkit VM (CPUs=2, Memory=2048MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
pause_test.go:82: failed to start minikube with args: "out/minikube-darwin-amd64 start -p pause-586000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit " : exit status 90
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-586000 -n pause-586000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p pause-586000 -n pause-586000: exit status 6 (146.154574ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0906 17:13:26.096494    5446 status.go:415] kubeconfig endpoint: extract IP: "pause-586000" does not appear in /Users/jenkins/minikube-integration/17174-977/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "pause-586000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestPause/serial/Start (16.55s)

                                                
                                    
TestNetworkPlugins/group/flannel/Start (15.44s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p flannel-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=hyperkit 
net_test.go:112: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p flannel-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=hyperkit : exit status 90 (15.420101792s)

                                                
                                                
-- stdout --
	* [flannel-758000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node flannel-758000 in cluster flannel-758000
	* Creating hyperkit VM (CPUs=2, Memory=3072MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0906 17:18:50.035298    6708 out.go:296] Setting OutFile to fd 1 ...
	I0906 17:18:50.035983    6708 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 17:18:50.035994    6708 out.go:309] Setting ErrFile to fd 2...
	I0906 17:18:50.036001    6708 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 17:18:50.036705    6708 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	I0906 17:18:50.038348    6708 out.go:303] Setting JSON to false
	I0906 17:18:50.059771    6708 start.go:128] hostinfo: {"hostname":"MacOS-Agent-1.local","uptime":2903,"bootTime":1694043027,"procs":491,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.5.1","kernelVersion":"22.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0906 17:18:50.059887    6708 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0906 17:18:50.107075    6708 out.go:177] * [flannel-758000] minikube v1.31.2 on Darwin 13.5.1
	I0906 17:18:50.127215    6708 notify.go:220] Checking for updates...
	I0906 17:18:50.151017    6708 out.go:177]   - MINIKUBE_LOCATION=17174
	I0906 17:18:50.214331    6708 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	I0906 17:18:50.262187    6708 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 17:18:50.321286    6708 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 17:18:50.366435    6708 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	I0906 17:18:50.412360    6708 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0906 17:18:50.435928    6708 config.go:182] Loaded profile config "enable-default-cni-758000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 17:18:50.436023    6708 driver.go:373] Setting default libvirt URI to qemu:///system
	I0906 17:18:50.464333    6708 out.go:177] * Using the hyperkit driver based on user configuration
	I0906 17:18:50.509257    6708 start.go:298] selected driver: hyperkit
	I0906 17:18:50.509272    6708 start.go:902] validating driver "hyperkit" against <nil>
	I0906 17:18:50.509288    6708 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 17:18:50.511933    6708 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:18:50.512040    6708 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17174-977/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 17:18:50.518819    6708 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.31.2
	I0906 17:18:50.522157    6708 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 17:18:50.522175    6708 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0906 17:18:50.522205    6708 start_flags.go:307] no existing cluster config was found, will generate one from the flags 
	I0906 17:18:50.522413    6708 start_flags.go:922] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0906 17:18:50.522444    6708 cni.go:84] Creating CNI manager for "flannel"
	I0906 17:18:50.522451    6708 start_flags.go:316] Found "Flannel" CNI - setting NetworkPlugin=cni
	I0906 17:18:50.522458    6708 start_flags.go:321] config:
	{Name:flannel-758000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:3072 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.1 ClusterName:flannel-758000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:dock
er CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:flannel NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:15m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s}
	I0906 17:18:50.522612    6708 iso.go:125] acquiring lock: {Name:mk785f5a651fb55e13065a70647b69ec2c0160e8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:18:50.583245    6708 out.go:177] * Starting control plane node flannel-758000 in cluster flannel-758000
	I0906 17:18:50.604169    6708 preload.go:132] Checking if preload exists for k8s version v1.28.1 and runtime docker
	I0906 17:18:50.604222    6708 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/17174-977/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.1-docker-overlay2-amd64.tar.lz4
	I0906 17:18:50.604240    6708 cache.go:57] Caching tarball of preloaded images
	I0906 17:18:50.604356    6708 preload.go:174] Found /Users/jenkins/minikube-integration/17174-977/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.1-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0906 17:18:50.604365    6708 cache.go:60] Finished verifying existence of preloaded tar for  v1.28.1 on docker
	I0906 17:18:50.604466    6708 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/flannel-758000/config.json ...
	I0906 17:18:50.604488    6708 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/flannel-758000/config.json: {Name:mka10a38b01406ef6fda7f72db1bfe89af85b9dd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 17:18:50.604746    6708 start.go:365] acquiring machines lock for flannel-758000: {Name:mkf8413c64a17b8d5adfa32fa8b277d511d8f398 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0906 17:18:50.604794    6708 start.go:369] acquired machines lock for "flannel-758000" in 38.434µs
	I0906 17:18:50.604814    6708 start.go:93] Provisioning new machine with config: &{Name:flannel-758000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:3072 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesC
onfig:{KubernetesVersion:v1.28.1 ClusterName:flannel-758000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:flannel NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:15m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:26
2144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0906 17:18:50.604862    6708 start.go:125] createHost starting for "" (driver="hyperkit")
	I0906 17:18:50.626094    6708 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=3072MB, Disk=20000MB) ...
	I0906 17:18:50.626525    6708 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 17:18:50.626612    6708 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 17:18:50.634231    6708 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54747
	I0906 17:18:50.634582    6708 main.go:141] libmachine: () Calling .GetVersion
	I0906 17:18:50.635017    6708 main.go:141] libmachine: Using API Version  1
	I0906 17:18:50.635031    6708 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 17:18:50.635244    6708 main.go:141] libmachine: () Calling .GetMachineName
	I0906 17:18:50.635344    6708 main.go:141] libmachine: (flannel-758000) Calling .GetMachineName
	I0906 17:18:50.635429    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:18:50.635521    6708 start.go:159] libmachine.API.Create for "flannel-758000" (driver="hyperkit")
	I0906 17:18:50.635545    6708 client.go:168] LocalClient.Create starting
	I0906 17:18:50.635581    6708 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem
	I0906 17:18:50.635631    6708 main.go:141] libmachine: Decoding PEM data...
	I0906 17:18:50.635644    6708 main.go:141] libmachine: Parsing certificate...
	I0906 17:18:50.635708    6708 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem
	I0906 17:18:50.635739    6708 main.go:141] libmachine: Decoding PEM data...
	I0906 17:18:50.635751    6708 main.go:141] libmachine: Parsing certificate...
	I0906 17:18:50.635764    6708 main.go:141] libmachine: Running pre-create checks...
	I0906 17:18:50.635777    6708 main.go:141] libmachine: (flannel-758000) Calling .PreCreateCheck
	I0906 17:18:50.635852    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:18:50.636030    6708 main.go:141] libmachine: (flannel-758000) Calling .GetConfigRaw
	I0906 17:18:50.663633    6708 main.go:141] libmachine: Creating machine...
	I0906 17:18:50.663660    6708 main.go:141] libmachine: (flannel-758000) Calling .Create
	I0906 17:18:50.663868    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:18:50.664103    6708 main.go:141] libmachine: (flannel-758000) DBG | I0906 17:18:50.663837    6716 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/17174-977/.minikube
	I0906 17:18:50.664171    6708 main.go:141] libmachine: (flannel-758000) Downloading /Users/jenkins/minikube-integration/17174-977/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/17174-977/.minikube/cache/iso/amd64/minikube-v1.31.0-1692872107-17120-amd64.iso...
	I0906 17:18:50.828012    6708 main.go:141] libmachine: (flannel-758000) DBG | I0906 17:18:50.827916    6716 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/id_rsa...
	I0906 17:18:51.095204    6708 main.go:141] libmachine: (flannel-758000) DBG | I0906 17:18:51.095114    6716 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/flannel-758000.rawdisk...
	I0906 17:18:51.095220    6708 main.go:141] libmachine: (flannel-758000) DBG | Writing magic tar header
	I0906 17:18:51.095231    6708 main.go:141] libmachine: (flannel-758000) DBG | Writing SSH key tar header
	I0906 17:18:51.095593    6708 main.go:141] libmachine: (flannel-758000) DBG | I0906 17:18:51.095525    6716 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000 ...
	I0906 17:18:51.423341    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:18:51.423382    6708 main.go:141] libmachine: (flannel-758000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/hyperkit.pid
	I0906 17:18:51.423402    6708 main.go:141] libmachine: (flannel-758000) DBG | Using UUID 1ec95ac6-4d14-11ee-99b3-149d997cd0f1
	I0906 17:18:51.445752    6708 main.go:141] libmachine: (flannel-758000) DBG | Generated MAC 52:29:8f:d5:c3:c2
	I0906 17:18:51.445788    6708 main.go:141] libmachine: (flannel-758000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=flannel-758000
	I0906 17:18:51.445842    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"1ec95ac6-4d14-11ee-99b3-149d997cd0f1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0005241b0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/initrd", Bootrom:"", CPUs:2, Memory:3072, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0906 17:18:51.445888    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"1ec95ac6-4d14-11ee-99b3-149d997cd0f1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0005241b0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/initrd", Bootrom:"", CPUs:2, Memory:3072, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0906 17:18:51.445954    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/hyperkit.pid", "-c", "2", "-m", "3072M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "1ec95ac6-4d14-11ee-99b3-149d997cd0f1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/flannel-758000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/tty,log=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/bzimage,/Users/jenkins/minikube-integration/17174-977/.minikube/machin
es/flannel-758000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=flannel-758000"}
	I0906 17:18:51.445998    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/hyperkit.pid -c 2 -m 3072M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 1ec95ac6-4d14-11ee-99b3-149d997cd0f1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/flannel-758000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/tty,log=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/console-ring -f kexec,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/bzimage,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/initrd,earlyprintk=serial loglevel=3 console
=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=flannel-758000"
	I0906 17:18:51.446018    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0906 17:18:51.448570    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 DEBUG: hyperkit: Pid is 6717
	I0906 17:18:51.448968    6708 main.go:141] libmachine: (flannel-758000) DBG | Attempt 0
	I0906 17:18:51.448980    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:18:51.449095    6708 main.go:141] libmachine: (flannel-758000) DBG | hyperkit pid from json: 6717
	I0906 17:18:51.450082    6708 main.go:141] libmachine: (flannel-758000) DBG | Searching for 52:29:8f:d5:c3:c2 in /var/db/dhcpd_leases ...
	I0906 17:18:51.450165    6708 main.go:141] libmachine: (flannel-758000) DBG | Found 34 entries in /var/db/dhcpd_leases!
	I0906 17:18:51.450180    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:18:51.450190    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:18:51.450201    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:18:51.450208    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:18:51.450221    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:18:51.450229    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:18:51.450236    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:18:51.450243    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:18:51.450253    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:18:51.450265    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:18:51.450274    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:18:51.450282    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:18:51.450315    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:18:51.450325    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:18:51.450334    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:18:51.450341    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:18:51.450355    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:18:51.450368    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:18:51.450376    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:18:51.450384    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:18:51.450393    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:18:51.450399    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:18:51.450408    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:18:51.450415    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:18:51.450423    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:18:51.450433    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:18:51.450442    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:18:51.450457    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:18:51.450481    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:18:51.450496    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:18:51.450505    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:18:51.450513    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:18:51.450525    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:18:51.450535    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:18:51.454967    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0906 17:18:51.463592    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0906 17:18:51.464350    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 17:18:51.464365    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 17:18:51.464374    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 17:18:51.464382    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 17:18:51.850121    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0906 17:18:51.850138    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0906 17:18:51.954084    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 17:18:51.954102    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 17:18:51.954115    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 17:18:51.954130    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 17:18:51.954995    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0906 17:18:51.955006    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:51 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0906 17:18:53.451858    6708 main.go:141] libmachine: (flannel-758000) DBG | Attempt 1
	I0906 17:18:53.451889    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:18:53.452003    6708 main.go:141] libmachine: (flannel-758000) DBG | hyperkit pid from json: 6717
	I0906 17:18:53.452945    6708 main.go:141] libmachine: (flannel-758000) DBG | Searching for 52:29:8f:d5:c3:c2 in /var/db/dhcpd_leases ...
	I0906 17:18:53.453090    6708 main.go:141] libmachine: (flannel-758000) DBG | Found 34 entries in /var/db/dhcpd_leases!
	I0906 17:18:53.453115    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:18:53.453127    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:18:53.453134    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:18:53.453156    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:18:53.453166    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:18:53.453174    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:18:53.453183    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:18:53.453198    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:18:53.453206    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:18:53.453213    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:18:53.453220    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:18:53.453228    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:18:53.453234    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:18:53.453246    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:18:53.453253    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:18:53.453259    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:18:53.453266    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:18:53.453274    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:18:53.453280    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:18:53.453289    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:18:53.453298    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:18:53.453306    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:18:53.453315    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:18:53.453325    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:18:53.453333    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:18:53.453341    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:18:53.453350    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:18:53.453357    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:18:53.453366    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:18:53.453378    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:18:53.453389    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:18:53.453413    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:18:53.453437    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:18:53.453446    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:18:55.455237    6708 main.go:141] libmachine: (flannel-758000) DBG | Attempt 2
	I0906 17:18:55.455253    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:18:55.455292    6708 main.go:141] libmachine: (flannel-758000) DBG | hyperkit pid from json: 6717
	I0906 17:18:55.456371    6708 main.go:141] libmachine: (flannel-758000) DBG | Searching for 52:29:8f:d5:c3:c2 in /var/db/dhcpd_leases ...
	I0906 17:18:55.456434    6708 main.go:141] libmachine: (flannel-758000) DBG | Found 34 entries in /var/db/dhcpd_leases!
	I0906 17:18:55.456448    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:18:55.456466    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:18:55.456479    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:18:55.456488    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:18:55.456494    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:18:55.456506    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:18:55.456516    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:18:55.456532    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:18:55.456542    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:18:55.456550    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:18:55.456559    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:18:55.456570    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:18:55.456581    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:18:55.456593    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:18:55.456602    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:18:55.456610    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:18:55.456618    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:18:55.456630    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:18:55.456639    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:18:55.456647    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:18:55.456657    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:18:55.456664    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:18:55.456674    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:18:55.456682    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:18:55.456691    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:18:55.456699    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:18:55.456707    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:18:55.456715    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:18:55.456724    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:18:55.456733    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:18:55.456742    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:18:55.456754    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:18:55.456763    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:18:55.456772    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:18:56.968980    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:56 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0906 17:18:56.969164    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:56 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0906 17:18:56.969196    6708 main.go:141] libmachine: (flannel-758000) DBG | 2023/09/06 17:18:56 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0906 17:18:57.457100    6708 main.go:141] libmachine: (flannel-758000) DBG | Attempt 3
	I0906 17:18:57.457125    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:18:57.457212    6708 main.go:141] libmachine: (flannel-758000) DBG | hyperkit pid from json: 6717
	I0906 17:18:57.458267    6708 main.go:141] libmachine: (flannel-758000) DBG | Searching for 52:29:8f:d5:c3:c2 in /var/db/dhcpd_leases ...
	I0906 17:18:57.458353    6708 main.go:141] libmachine: (flannel-758000) DBG | Found 34 entries in /var/db/dhcpd_leases!
	I0906 17:18:57.458364    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:18:57.458372    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:18:57.458402    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:18:57.458417    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:18:57.458434    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:18:57.458445    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:18:57.458468    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:18:57.458483    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:18:57.458493    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:18:57.458502    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:18:57.458510    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:18:57.458522    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:18:57.458540    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:18:57.458553    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:18:57.458561    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:18:57.458569    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:18:57.458577    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:18:57.458586    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:18:57.458594    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:18:57.458602    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:18:57.458610    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:18:57.458618    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:18:57.458641    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:18:57.458656    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:18:57.458665    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:18:57.458673    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:18:57.458682    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:18:57.458693    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:18:57.458708    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:18:57.458721    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:18:57.458733    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:18:57.458742    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:18:57.458751    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:18:57.458764    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:18:59.459102    6708 main.go:141] libmachine: (flannel-758000) DBG | Attempt 4
	I0906 17:18:59.459127    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:18:59.459204    6708 main.go:141] libmachine: (flannel-758000) DBG | hyperkit pid from json: 6717
	I0906 17:18:59.460718    6708 main.go:141] libmachine: (flannel-758000) DBG | Searching for 52:29:8f:d5:c3:c2 in /var/db/dhcpd_leases ...
	I0906 17:18:59.460813    6708 main.go:141] libmachine: (flannel-758000) DBG | Found 34 entries in /var/db/dhcpd_leases!
	I0906 17:18:59.460843    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:18:59.460860    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:18:59.460892    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:18:59.460919    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:18:59.460967    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:18:59.460981    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:18:59.460990    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:18:59.461004    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:18:59.461016    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:18:59.461026    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:18:59.461034    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:18:59.461061    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:18:59.461073    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:18:59.461098    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:18:59.461135    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:18:59.461168    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:18:59.461194    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:18:59.461213    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:18:59.461229    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:18:59.461252    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:18:59.461273    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:18:59.461296    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:18:59.461312    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:18:59.461326    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:18:59.461338    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:18:59.461355    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:18:59.461365    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:18:59.461374    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:18:59.461403    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:18:59.461412    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:18:59.461420    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:18:59.461427    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:18:59.461443    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:18:59.461461    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:19:01.462644    6708 main.go:141] libmachine: (flannel-758000) DBG | Attempt 5
	I0906 17:19:01.462668    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:19:01.462787    6708 main.go:141] libmachine: (flannel-758000) DBG | hyperkit pid from json: 6717
	I0906 17:19:01.464118    6708 main.go:141] libmachine: (flannel-758000) DBG | Searching for 52:29:8f:d5:c3:c2 in /var/db/dhcpd_leases ...
	I0906 17:19:01.464209    6708 main.go:141] libmachine: (flannel-758000) DBG | Found 35 entries in /var/db/dhcpd_leases!
	I0906 17:19:01.464228    6708 main.go:141] libmachine: (flannel-758000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:52:29:8f:d5:c3:c2 ID:1,52:29:8f:d5:c3:c2 Lease:0x64fa6873}
	I0906 17:19:01.464250    6708 main.go:141] libmachine: (flannel-758000) DBG | Found match: 52:29:8f:d5:c3:c2
	I0906 17:19:01.464259    6708 main.go:141] libmachine: (flannel-758000) DBG | IP: 192.168.64.36
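The lease scan above is how the hyperkit driver maps the VM's MAC address (52:29:8f:d5:c3:c2) to an IP: it re-reads /var/db/dhcpd_leases on every attempt until an entry with that hardware address appears (here, 192.168.64.36 on attempt 5). The Go sketch below shows one way to do that lookup; it is illustrative only, the lease-file field names (ip_address=, hw_address=) and their ordering are assumptions about the vmnet lease format, and findIPByMAC is a hypothetical helper, not minikube's implementation.

	// A minimal sketch (field names assumed) of the MAC -> IP lookup the
	// log performs against /var/db/dhcpd_leases.
	package main

	import (
		"bufio"
		"fmt"
		"os"
		"strings"
	)

	// findIPByMAC scans a vmnet-style lease file and returns the IP whose
	// hw_address entry ends with the given MAC (e.g. "52:29:8f:d5:c3:c2").
	func findIPByMAC(path, mac string) (string, error) {
		f, err := os.Open(path)
		if err != nil {
			return "", err
		}
		defer f.Close()

		var ip string
		sc := bufio.NewScanner(f)
		for sc.Scan() {
			line := strings.TrimSpace(sc.Text())
			switch {
			case strings.HasPrefix(line, "ip_address="):
				// Remember the IP of the lease block we are currently in.
				ip = strings.TrimPrefix(line, "ip_address=")
			case strings.HasPrefix(line, "hw_address="):
				// hw_address is typically "1,<mac>"; compare the MAC part only.
				if strings.HasSuffix(line, ","+mac) || strings.HasSuffix(line, "="+mac) {
					return ip, nil
				}
			}
		}
		if err := sc.Err(); err != nil {
			return "", err
		}
		return "", fmt.Errorf("no lease found for %s", mac)
	}

	func main() {
		ip, err := findIPByMAC("/var/db/dhcpd_leases", "52:29:8f:d5:c3:c2")
		if err != nil {
			fmt.Println(err) // the driver simply retries on the next attempt
			return
		}
		fmt.Println("IP:", ip) // the log above resolves this MAC to 192.168.64.36
	}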
	I0906 17:19:01.464298    6708 main.go:141] libmachine: (flannel-758000) Calling .GetConfigRaw
	I0906 17:19:01.464932    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:19:01.465085    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:19:01.465213    6708 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0906 17:19:01.465223    6708 main.go:141] libmachine: (flannel-758000) Calling .GetState
	I0906 17:19:01.465306    6708 main.go:141] libmachine: (flannel-758000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:19:01.465358    6708 main.go:141] libmachine: (flannel-758000) DBG | hyperkit pid from json: 6717
	I0906 17:19:01.466302    6708 main.go:141] libmachine: Detecting operating system of created instance...
	I0906 17:19:01.466313    6708 main.go:141] libmachine: Waiting for SSH to be available...
	I0906 17:19:01.466320    6708 main.go:141] libmachine: Getting to WaitForSSH function...
	I0906 17:19:01.466326    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:01.466414    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:01.466508    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.466612    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.466756    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:01.466895    6708 main.go:141] libmachine: Using SSH client type: native
	I0906 17:19:01.467268    6708 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.36 22 <nil> <nil>}
	I0906 17:19:01.467277    6708 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0906 17:19:01.544232    6708 main.go:141] libmachine: SSH cmd err, output: <nil>: 
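The `exit 0` probe above is how the driver waits for SSH to become available: it keeps opening an SSH session to the new VM and running a no-op command until one succeeds. A minimal sketch of that loop with golang.org/x/crypto/ssh is below; the retry interval and overall timeout are assumptions, waitForSSH is a hypothetical helper rather than libmachine's actual WaitForSSH, and the address, user, and key path are taken from the log.

	// A minimal sketch (assumed timings) of waiting for SSH by running
	// `exit 0` until it succeeds, as in the log above.
	package main

	import (
		"fmt"
		"os"
		"time"

		"golang.org/x/crypto/ssh"
	)

	func waitForSSH(addr, user, keyPath string, timeout time.Duration) error {
		key, err := os.ReadFile(keyPath)
		if err != nil {
			return err
		}
		signer, err := ssh.ParsePrivateKey(key)
		if err != nil {
			return err
		}
		cfg := &ssh.ClientConfig{
			User:            user,
			Auth:            []ssh.AuthMethod{ssh.PublicKeys(signer)},
			HostKeyCallback: ssh.InsecureIgnoreHostKey(), // freshly created test VM, no known_hosts entry yet
			Timeout:         5 * time.Second,
		}
		deadline := time.Now().Add(timeout)
		for time.Now().Before(deadline) {
			client, err := ssh.Dial("tcp", addr, cfg)
			if err == nil {
				sess, serr := client.NewSession()
				if serr == nil {
					runErr := sess.Run("exit 0") // same probe as the SSH command in the log
					sess.Close()
					client.Close()
					if runErr == nil {
						return nil // SSH is available
					}
				} else {
					client.Close()
				}
			}
			time.Sleep(2 * time.Second) // assumed polling interval
		}
		return fmt.Errorf("ssh not available on %s after %s", addr, timeout)
	}

	func main() {
		err := waitForSSH("192.168.64.36:22", "docker",
			"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/id_rsa",
			2*time.Minute)
		if err != nil {
			fmt.Println(err)
		}
	}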
	I0906 17:19:01.544247    6708 main.go:141] libmachine: Detecting the provisioner...
	I0906 17:19:01.544254    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:01.544402    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:01.544509    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.544612    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.544707    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:01.544838    6708 main.go:141] libmachine: Using SSH client type: native
	I0906 17:19:01.545147    6708 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.36 22 <nil> <nil>}
	I0906 17:19:01.545156    6708 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0906 17:19:01.620479    6708 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2021.02.12-1-g88b5c50-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"
	
	I0906 17:19:01.620535    6708 main.go:141] libmachine: found compatible host: buildroot
	I0906 17:19:01.620542    6708 main.go:141] libmachine: Provisioning with buildroot...
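The provisioner detection above works by running `cat /etc/os-release` over SSH and matching the ID field against known provisioners (ID=buildroot here). A small sketch of that parse, using the exact output captured in the log, is below; parseOSRelease is a hypothetical helper for illustration, not minikube's detector.

	// A minimal sketch of "Detecting the provisioner": parse os-release
	// output and pick a provisioner based on the ID field.
	package main

	import (
		"bufio"
		"fmt"
		"strings"
	)

	// parseOSRelease turns os-release output into a key/value map,
	// stripping optional surrounding quotes.
	func parseOSRelease(out string) map[string]string {
		kv := map[string]string{}
		sc := bufio.NewScanner(strings.NewReader(out))
		for sc.Scan() {
			line := strings.TrimSpace(sc.Text())
			if line == "" || strings.HasPrefix(line, "#") {
				continue
			}
			parts := strings.SplitN(line, "=", 2)
			if len(parts) != 2 {
				continue
			}
			kv[parts[0]] = strings.Trim(parts[1], `"`)
		}
		return kv
	}

	func main() {
		// Output copied verbatim from the SSH command above.
		out := `NAME=Buildroot
	VERSION=2021.02.12-1-g88b5c50-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"`

		osr := parseOSRelease(out)
		if osr["ID"] == "buildroot" {
			fmt.Println("found compatible host: buildroot") // matches the log line above
		}
	}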
	I0906 17:19:01.620557    6708 main.go:141] libmachine: (flannel-758000) Calling .GetMachineName
	I0906 17:19:01.620686    6708 buildroot.go:166] provisioning hostname "flannel-758000"
	I0906 17:19:01.620701    6708 main.go:141] libmachine: (flannel-758000) Calling .GetMachineName
	I0906 17:19:01.620792    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:01.620882    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:01.620970    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.621058    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.621157    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:01.621278    6708 main.go:141] libmachine: Using SSH client type: native
	I0906 17:19:01.621578    6708 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.36 22 <nil> <nil>}
	I0906 17:19:01.621588    6708 main.go:141] libmachine: About to run SSH command:
	sudo hostname flannel-758000 && echo "flannel-758000" | sudo tee /etc/hostname
	I0906 17:19:01.705451    6708 main.go:141] libmachine: SSH cmd err, output: <nil>: flannel-758000
	
	I0906 17:19:01.705470    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:01.705611    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:01.705704    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.705798    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.705909    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:01.706045    6708 main.go:141] libmachine: Using SSH client type: native
	I0906 17:19:01.706359    6708 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.36 22 <nil> <nil>}
	I0906 17:19:01.706371    6708 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sflannel-758000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 flannel-758000/g' /etc/hosts;
				else 
					echo '127.0.1.1 flannel-758000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0906 17:19:01.785283    6708 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0906 17:19:01.785301    6708 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/17174-977/.minikube CaCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/17174-977/.minikube}
	I0906 17:19:01.785321    6708 buildroot.go:174] setting up certificates
	I0906 17:19:01.785331    6708 provision.go:83] configureAuth start
	I0906 17:19:01.785343    6708 main.go:141] libmachine: (flannel-758000) Calling .GetMachineName
	I0906 17:19:01.785476    6708 main.go:141] libmachine: (flannel-758000) Calling .GetIP
	I0906 17:19:01.785590    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:01.785683    6708 provision.go:138] copyHostCerts
	I0906 17:19:01.785757    6708 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem, removing ...
	I0906 17:19:01.785766    6708 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem
	I0906 17:19:01.785966    6708 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem (1078 bytes)
	I0906 17:19:01.786183    6708 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem, removing ...
	I0906 17:19:01.786189    6708 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem
	I0906 17:19:01.786267    6708 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem (1123 bytes)
	I0906 17:19:01.786430    6708 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem, removing ...
	I0906 17:19:01.786436    6708 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem
	I0906 17:19:01.786509    6708 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem (1679 bytes)
	I0906 17:19:01.786642    6708 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca-key.pem org=jenkins.flannel-758000 san=[192.168.64.36 192.168.64.36 localhost 127.0.0.1 minikube flannel-758000]
	I0906 17:19:01.845745    6708 provision.go:172] copyRemoteCerts
	I0906 17:19:01.845799    6708 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0906 17:19:01.845814    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:01.845978    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:01.846161    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.846335    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:01.846475    6708 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/id_rsa Username:docker}
	I0906 17:19:01.888341    6708 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0906 17:19:01.904562    6708 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem --> /etc/docker/server.pem (1220 bytes)
	I0906 17:19:01.920439    6708 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0906 17:19:01.936288    6708 provision.go:86] duration metric: configureAuth took 150.945775ms
	I0906 17:19:01.936299    6708 buildroot.go:189] setting minikube options for container-runtime
	I0906 17:19:01.936428    6708 config.go:182] Loaded profile config "flannel-758000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 17:19:01.936441    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:19:01.936568    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:01.936648    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:01.936735    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.936823    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:01.936908    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:01.937008    6708 main.go:141] libmachine: Using SSH client type: native
	I0906 17:19:01.937301    6708 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.36 22 <nil> <nil>}
	I0906 17:19:01.937309    6708 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0906 17:19:02.015008    6708 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0906 17:19:02.015019    6708 buildroot.go:70] root file system type: tmpfs
	I0906 17:19:02.015112    6708 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0906 17:19:02.015127    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:02.015264    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:02.015364    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.015451    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.015546    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:02.015664    6708 main.go:141] libmachine: Using SSH client type: native
	I0906 17:19:02.015963    6708 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.36 22 <nil> <nil>}
	I0906 17:19:02.016011    6708 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0906 17:19:02.098333    6708 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0906 17:19:02.098358    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:02.098499    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:02.098603    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.098689    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.098781    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:02.098924    6708 main.go:141] libmachine: Using SSH client type: native
	I0906 17:19:02.099234    6708 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.36 22 <nil> <nil>}
	I0906 17:19:02.099246    6708 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0906 17:19:02.610550    6708 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
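The SSH command above implements an idempotent unit update: the rendered docker.service is written to docker.service.new, diffed against the installed unit, and only moved into place (followed by daemon-reload, enable, and restart) when it differs; on this fresh VM the diff fails because no unit exists yet, so the new file is installed and the service enabled. The local Go sketch below mirrors that write-compare-swap pattern; updateUnit is a hypothetical helper, and the systemctl invocations are copied from the shell in the log rather than from minikube's code.

	// A minimal local sketch of the write-compare-swap pattern used for
	// /lib/systemd/system/docker.service in the log above.
	package main

	import (
		"bytes"
		"fmt"
		"os"
		"os/exec"
	)

	func updateUnit(path string, rendered []byte) error {
		current, err := os.ReadFile(path)
		if err == nil && bytes.Equal(current, rendered) {
			return nil // unchanged: skip the restart, like the `diff ... || { ... }` guard
		}
		newPath := path + ".new"
		if err := os.WriteFile(newPath, rendered, 0o644); err != nil {
			return err
		}
		if err := os.Rename(newPath, path); err != nil {
			return err
		}
		// Mirror the shell in the log: daemon-reload, enable, restart.
		for _, args := range [][]string{
			{"systemctl", "-f", "daemon-reload"},
			{"systemctl", "-f", "enable", "docker"},
			{"systemctl", "-f", "restart", "docker"},
		} {
			if out, err := exec.Command(args[0], args[1:]...).CombinedOutput(); err != nil {
				return fmt.Errorf("%v: %v: %s", args, err, out)
			}
		}
		return nil
	}

	func main() {
		// Unit content truncated for the sketch; the full text is echoed in the log above.
		unit := []byte("[Unit]\nDescription=Docker Application Container Engine\n")
		if err := updateUnit("/lib/systemd/system/docker.service", unit); err != nil {
			fmt.Println(err) // needs root on a real system
		}
	}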
	I0906 17:19:02.610567    6708 main.go:141] libmachine: Checking connection to Docker...
	I0906 17:19:02.610574    6708 main.go:141] libmachine: (flannel-758000) Calling .GetURL
	I0906 17:19:02.610831    6708 main.go:141] libmachine: Docker is up and running!
	I0906 17:19:02.610840    6708 main.go:141] libmachine: Reticulating splines...
	I0906 17:19:02.610858    6708 client.go:171] LocalClient.Create took 11.975441781s
	I0906 17:19:02.610870    6708 start.go:167] duration metric: libmachine.API.Create for "flannel-758000" took 11.975496959s
	I0906 17:19:02.610911    6708 start.go:300] post-start starting for "flannel-758000" (driver="hyperkit")
	I0906 17:19:02.610920    6708 start.go:329] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0906 17:19:02.610931    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:19:02.611171    6708 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0906 17:19:02.611193    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:02.611283    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:02.611351    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.611430    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:02.611519    6708 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/id_rsa Username:docker}
	I0906 17:19:02.654413    6708 ssh_runner.go:195] Run: cat /etc/os-release
	I0906 17:19:02.657004    6708 info.go:137] Remote host: Buildroot 2021.02.12
	I0906 17:19:02.657016    6708 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17174-977/.minikube/addons for local assets ...
	I0906 17:19:02.657099    6708 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17174-977/.minikube/files for local assets ...
	I0906 17:19:02.657265    6708 filesync.go:149] local asset: /Users/jenkins/minikube-integration/17174-977/.minikube/files/etc/ssl/certs/14382.pem -> 14382.pem in /etc/ssl/certs
	I0906 17:19:02.657450    6708 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0906 17:19:02.662980    6708 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/files/etc/ssl/certs/14382.pem --> /etc/ssl/certs/14382.pem (1708 bytes)
	I0906 17:19:02.678779    6708 start.go:303] post-start completed in 67.861474ms
	I0906 17:19:02.678837    6708 main.go:141] libmachine: (flannel-758000) Calling .GetConfigRaw
	I0906 17:19:02.679564    6708 main.go:141] libmachine: (flannel-758000) Calling .GetIP
	I0906 17:19:02.679845    6708 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/flannel-758000/config.json ...
	I0906 17:19:02.680152    6708 start.go:128] duration metric: createHost completed in 12.075432935s
	I0906 17:19:02.680169    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:02.680260    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:02.680411    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.680518    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.680724    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:02.680885    6708 main.go:141] libmachine: Using SSH client type: native
	I0906 17:19:02.681176    6708 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.36 22 <nil> <nil>}
	I0906 17:19:02.681183    6708 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0906 17:19:02.756721    6708 main.go:141] libmachine: SSH cmd err, output: <nil>: 1694045942.897247204
	
	I0906 17:19:02.756733    6708 fix.go:206] guest clock: 1694045942.897247204
	I0906 17:19:02.756738    6708 fix.go:219] Guest: 2023-09-06 17:19:02.897247204 -0700 PDT Remote: 2023-09-06 17:19:02.680163 -0700 PDT m=+12.678041373 (delta=217.084204ms)
	I0906 17:19:02.756758    6708 fix.go:190] guest clock delta is within tolerance: 217.084204ms
	I0906 17:19:02.756763    6708 start.go:83] releasing machines lock for "flannel-758000", held for 12.152110088s
	I0906 17:19:02.756782    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:19:02.756914    6708 main.go:141] libmachine: (flannel-758000) Calling .GetIP
	I0906 17:19:02.757002    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:19:02.757322    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:19:02.757411    6708 main.go:141] libmachine: (flannel-758000) Calling .DriverName
	I0906 17:19:02.757494    6708 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0906 17:19:02.757519    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:02.757537    6708 ssh_runner.go:195] Run: cat /version.json
	I0906 17:19:02.757552    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHHostname
	I0906 17:19:02.757616    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:02.757666    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHPort
	I0906 17:19:02.757696    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.757787    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHKeyPath
	I0906 17:19:02.757802    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:02.757870    6708 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/id_rsa Username:docker}
	I0906 17:19:02.757885    6708 main.go:141] libmachine: (flannel-758000) Calling .GetSSHUsername
	I0906 17:19:02.757963    6708 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/flannel-758000/id_rsa Username:docker}
	I0906 17:19:02.796226    6708 ssh_runner.go:195] Run: systemctl --version
	I0906 17:19:02.852700    6708 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0906 17:19:02.857061    6708 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0906 17:19:02.857107    6708 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0906 17:19:02.868357    6708 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0906 17:19:02.868372    6708 start.go:466] detecting cgroup driver to use...
	I0906 17:19:02.868474    6708 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 17:19:02.880319    6708 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0906 17:19:02.887368    6708 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0906 17:19:02.894466    6708 containerd.go:145] configuring containerd to use "cgroupfs" as cgroup driver...
	I0906 17:19:02.894510    6708 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0906 17:19:02.901593    6708 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0906 17:19:02.908607    6708 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0906 17:19:02.915691    6708 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0906 17:19:02.922682    6708 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0906 17:19:02.929889    6708 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0906 17:19:02.936892    6708 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0906 17:19:02.943334    6708 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0906 17:19:02.949628    6708 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:19:03.033439    6708 ssh_runner.go:195] Run: sudo systemctl restart containerd
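	
	The run above points crictl at the containerd socket and forces containerd onto the cgroupfs driver before restarting it. Condensed into a commented sketch (same file paths and values as in the log; only the two most relevant sed edits are shown):
	
	# Tell crictl where containerd listens (same YAML content as written above).
	printf '%s\n' 'runtime-endpoint: unix:///run/containerd/containerd.sock' |
	  sudo tee /etc/crictl.yaml >/dev/null
	# Use the cgroupfs driver and the pause:3.9 sandbox image.
	sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml
	sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml
	# Pick up the changes.
	sudo systemctl daemon-reload
	sudo systemctl restart containerd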
	I0906 17:19:03.045011    6708 start.go:466] detecting cgroup driver to use...
	I0906 17:19:03.045086    6708 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0906 17:19:03.058381    6708 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 17:19:03.069158    6708 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0906 17:19:03.083162    6708 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 17:19:03.092145    6708 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 17:19:03.100112    6708 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0906 17:19:03.130654    6708 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 17:19:03.139987    6708 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 17:19:03.152249    6708 ssh_runner.go:195] Run: which cri-dockerd
	I0906 17:19:03.154779    6708 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0906 17:19:03.160527    6708 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0906 17:19:03.171763    6708 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0906 17:19:03.255338    6708 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0906 17:19:03.345395    6708 docker.go:535] configuring docker to use "cgroupfs" as cgroup driver...
	I0906 17:19:03.345427    6708 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (144 bytes)
	I0906 17:19:03.357355    6708 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:19:03.442485    6708 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0906 17:19:04.807696    6708 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.365193116s)
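	
	The 144-byte /etc/docker/daemon.json pushed a few lines above is not reproduced in the log; given the "configuring docker to use cgroupfs as cgroup driver" message, a plausible minimal payload would look roughly like the hypothetical sketch below (the exact contents minikube writes may differ):
	
	# Hypothetical reconstruction only; the real daemon.json from this run is not shown.
	sudo tee /etc/docker/daemon.json >/dev/null <<-'EOF'
	{
	  "exec-opts": ["native.cgroupdriver=cgroupfs"],
	  "log-driver": "json-file"
	}
	EOF
	sudo systemctl daemon-reload && sudo systemctl restart docker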
	I0906 17:19:04.807762    6708 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0906 17:19:04.905011    6708 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0906 17:19:05.005897    6708 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0906 17:19:05.103452    6708 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:19:05.189353    6708 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0906 17:19:05.241062    6708 out.go:177] 
	W0906 17:19:05.262366    6708 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	W0906 17:19:05.262395    6708 out.go:239] * 
	W0906 17:19:05.263601    6708 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0906 17:19:05.347307    6708 out.go:177] 

                                                
                                                
** /stderr **
net_test.go:114: failed start: exit status 90
--- FAIL: TestNetworkPlugins/group/flannel/Start (15.44s)
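
The start failed because "sudo systemctl restart cri-docker.socket" exited 1 inside the guest, and the error text only points at journalctl. After entering the guest with "minikube ssh -p flannel-758000" (the profile from this run), a few standard systemd commands would narrow down which unit is unhappy and why:

	# Run inside the guest VM to see why the socket unit failed to restart.
	sudo systemctl status cri-docker.socket cri-docker.service --no-pager
	sudo journalctl -u cri-docker.socket -u cri-docker.service --no-pager -n 100
	sudo systemctl cat cri-docker.socket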

                                                
                                    
TestStartStop/group/no-preload/serial/FirstStart (15.54s)

                                                
                                                
=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-343000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.28.1
E0906 17:20:45.610116    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:45.616017    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:45.626818    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:45.647584    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:45.688150    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:45.770240    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:45.930711    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:46.250827    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:46.891408    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:48.172738    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:20:49.920650    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:20:50.734971    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p no-preload-343000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.28.1: exit status 90 (15.381653325s)

                                                
                                                
-- stdout --
	* [no-preload-343000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on user configuration
	* Starting control plane node no-preload-343000 in cluster no-preload-343000
	* Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0906 17:20:36.991806    7577 out.go:296] Setting OutFile to fd 1 ...
	I0906 17:20:36.992108    7577 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 17:20:36.992118    7577 out.go:309] Setting ErrFile to fd 2...
	I0906 17:20:36.992124    7577 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 17:20:36.992447    7577 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	I0906 17:20:36.994755    7577 out.go:303] Setting JSON to false
	I0906 17:20:37.023139    7577 start.go:128] hostinfo: {"hostname":"MacOS-Agent-1.local","uptime":3010,"bootTime":1694043027,"procs":492,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.5.1","kernelVersion":"22.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0906 17:20:37.023393    7577 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0906 17:20:37.044249    7577 out.go:177] * [no-preload-343000] minikube v1.31.2 on Darwin 13.5.1
	I0906 17:20:37.085897    7577 notify.go:220] Checking for updates...
	I0906 17:20:37.128549    7577 out.go:177]   - MINIKUBE_LOCATION=17174
	I0906 17:20:37.176597    7577 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	I0906 17:20:37.219792    7577 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 17:20:37.264865    7577 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 17:20:37.309688    7577 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	I0906 17:20:37.354044    7577 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0906 17:20:37.378232    7577 config.go:182] Loaded profile config "old-k8s-version-077000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.16.0
	I0906 17:20:37.378367    7577 driver.go:373] Setting default libvirt URI to qemu:///system
	I0906 17:20:37.453736    7577 out.go:177] * Using the hyperkit driver based on user configuration
	I0906 17:20:37.474748    7577 start.go:298] selected driver: hyperkit
	I0906 17:20:37.474779    7577 start.go:902] validating driver "hyperkit" against <nil>
	I0906 17:20:37.474797    7577 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 17:20:37.479015    7577 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.479148    7577 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17174-977/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 17:20:37.489240    7577 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.31.2
	I0906 17:20:37.494680    7577 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 17:20:37.494708    7577 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0906 17:20:37.494752    7577 start_flags.go:307] no existing cluster config was found, will generate one from the flags 
	I0906 17:20:37.495115    7577 start_flags.go:922] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0906 17:20:37.495212    7577 cni.go:84] Creating CNI manager for ""
	I0906 17:20:37.495248    7577 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0906 17:20:37.495262    7577 start_flags.go:316] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0906 17:20:37.495275    7577 start_flags.go:321] config:
	{Name:no-preload-343000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.1 ClusterName:no-preload-343000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntim
e:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s}
	I0906 17:20:37.495530    7577 iso.go:125] acquiring lock: {Name:mk785f5a651fb55e13065a70647b69ec2c0160e8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.537651    7577 out.go:177] * Starting control plane node no-preload-343000 in cluster no-preload-343000
	I0906 17:20:37.558746    7577 preload.go:132] Checking if preload exists for k8s version v1.28.1 and runtime docker
	I0906 17:20:37.558904    7577 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/config.json ...
	I0906 17:20:37.558961    7577 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/config.json: {Name:mke07466d4c3798af273ec462557c28984bccbdf Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 17:20:37.558920    7577 cache.go:107] acquiring lock: {Name:mk7b4bf568960a4c71f88814aa07e29348e65115 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.558951    7577 cache.go:107] acquiring lock: {Name:mk24a2d1a6ee0ace10a633cb822cf2efdd99a3af Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.558993    7577 cache.go:107] acquiring lock: {Name:mk94b4a8b45db69c19210a73ebad5dac70a26d87 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.559007    7577 cache.go:107] acquiring lock: {Name:mkb4dcc07d99834fbca83f8637a9feae9c6aa58d Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.559082    7577 cache.go:107] acquiring lock: {Name:mkb028faf6a4969919ce9f16e2073889667b0e77 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.559037    7577 cache.go:107] acquiring lock: {Name:mk5b8fa544656818494a6bcbcc8f1898c7140dbe Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.559042    7577 cache.go:107] acquiring lock: {Name:mk9b5081d881064754215a6ff4ee5477bd41a73f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.559165    7577 cache.go:107] acquiring lock: {Name:mkfe22e87b12023a990813cfb51268feaf09df26 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 17:20:37.559082    7577 cache.go:115] /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/gcr.io/k8s-minikube/storage-provisioner_v5 exists
	I0906 17:20:37.559286    7577 image.go:134] retrieving image: registry.k8s.io/kube-controller-manager:v1.28.1
	I0906 17:20:37.559304    7577 cache.go:96] cache image "gcr.io/k8s-minikube/storage-provisioner:v5" -> "/Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/gcr.io/k8s-minikube/storage-provisioner_v5" took 349.704µs
	I0906 17:20:37.559321    7577 cache.go:80] save to tar file gcr.io/k8s-minikube/storage-provisioner:v5 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/gcr.io/k8s-minikube/storage-provisioner_v5 succeeded
	I0906 17:20:37.559372    7577 image.go:134] retrieving image: registry.k8s.io/coredns/coredns:v1.10.1
	I0906 17:20:37.559488    7577 image.go:134] retrieving image: registry.k8s.io/kube-apiserver:v1.28.1
	I0906 17:20:37.559533    7577 image.go:134] retrieving image: registry.k8s.io/kube-scheduler:v1.28.1
	I0906 17:20:37.559536    7577 start.go:365] acquiring machines lock for no-preload-343000: {Name:mkf8413c64a17b8d5adfa32fa8b277d511d8f398 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0906 17:20:37.559585    7577 image.go:134] retrieving image: registry.k8s.io/kube-proxy:v1.28.1
	I0906 17:20:37.559608    7577 start.go:369] acquired machines lock for "no-preload-343000" in 52.981µs
	I0906 17:20:37.559642    7577 image.go:134] retrieving image: registry.k8s.io/pause:3.9
	I0906 17:20:37.559636    7577 start.go:93] Provisioning new machine with config: &{Name:no-preload-343000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:2200 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.28.1 ClusterName:no-preload-343000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:2621
44 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0906 17:20:37.559724    7577 start.go:125] createHost starting for "" (driver="hyperkit")
	I0906 17:20:37.559734    7577 image.go:134] retrieving image: registry.k8s.io/etcd:3.5.9-0
	I0906 17:20:37.581739    7577 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0906 17:20:37.582173    7577 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 17:20:37.582221    7577 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 17:20:37.593157    7577 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:55935
	I0906 17:20:37.593694    7577 main.go:141] libmachine: () Calling .GetVersion
	I0906 17:20:37.594307    7577 main.go:141] libmachine: Using API Version  1
	I0906 17:20:37.594320    7577 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 17:20:37.594772    7577 main.go:141] libmachine: () Calling .GetMachineName
	I0906 17:20:37.594910    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetMachineName
	I0906 17:20:37.595031    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:37.595172    7577 start.go:159] libmachine.API.Create for "no-preload-343000" (driver="hyperkit")
	I0906 17:20:37.595198    7577 client.go:168] LocalClient.Create starting
	I0906 17:20:37.595241    7577 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem
	I0906 17:20:37.595302    7577 main.go:141] libmachine: Decoding PEM data...
	I0906 17:20:37.595319    7577 main.go:141] libmachine: Parsing certificate...
	I0906 17:20:37.595389    7577 main.go:141] libmachine: Reading certificate data from /Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem
	I0906 17:20:37.595434    7577 main.go:141] libmachine: Decoding PEM data...
	I0906 17:20:37.595447    7577 main.go:141] libmachine: Parsing certificate...
	I0906 17:20:37.595463    7577 main.go:141] libmachine: Running pre-create checks...
	I0906 17:20:37.595471    7577 main.go:141] libmachine: (no-preload-343000) Calling .PreCreateCheck
	I0906 17:20:37.595604    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:37.595871    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetConfigRaw
	I0906 17:20:37.603112    7577 main.go:141] libmachine: Creating machine...
	I0906 17:20:37.603135    7577 main.go:141] libmachine: (no-preload-343000) Calling .Create
	I0906 17:20:37.603498    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:37.603778    7577 main.go:141] libmachine: (no-preload-343000) DBG | I0906 17:20:37.603409    7585 common.go:145] Making disk image using store path: /Users/jenkins/minikube-integration/17174-977/.minikube
	I0906 17:20:37.603825    7577 main.go:141] libmachine: (no-preload-343000) Downloading /Users/jenkins/minikube-integration/17174-977/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/17174-977/.minikube/cache/iso/amd64/minikube-v1.31.0-1692872107-17120-amd64.iso...
	I0906 17:20:37.607168    7577 image.go:177] daemon lookup for registry.k8s.io/kube-controller-manager:v1.28.1: Error response from daemon: No such image: registry.k8s.io/kube-controller-manager:v1.28.1
	I0906 17:20:37.607169    7577 image.go:177] daemon lookup for registry.k8s.io/coredns/coredns:v1.10.1: Error response from daemon: No such image: registry.k8s.io/coredns/coredns:v1.10.1
	I0906 17:20:37.607277    7577 image.go:177] daemon lookup for registry.k8s.io/kube-proxy:v1.28.1: Error response from daemon: No such image: registry.k8s.io/kube-proxy:v1.28.1
	I0906 17:20:37.607343    7577 image.go:177] daemon lookup for registry.k8s.io/kube-scheduler:v1.28.1: Error response from daemon: No such image: registry.k8s.io/kube-scheduler:v1.28.1
	I0906 17:20:37.608742    7577 image.go:177] daemon lookup for registry.k8s.io/pause:3.9: Error response from daemon: No such image: registry.k8s.io/pause:3.9
	I0906 17:20:37.609002    7577 image.go:177] daemon lookup for registry.k8s.io/kube-apiserver:v1.28.1: Error response from daemon: No such image: registry.k8s.io/kube-apiserver:v1.28.1
	I0906 17:20:37.609038    7577 image.go:177] daemon lookup for registry.k8s.io/etcd:3.5.9-0: Error response from daemon: No such image: registry.k8s.io/etcd:3.5.9-0
	I0906 17:20:37.861960    7577 main.go:141] libmachine: (no-preload-343000) DBG | I0906 17:20:37.861607    7585 common.go:152] Creating ssh key: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/id_rsa...
	I0906 17:20:37.974458    7577 main.go:141] libmachine: (no-preload-343000) DBG | I0906 17:20:37.974393    7585 common.go:158] Creating raw disk image: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/no-preload-343000.rawdisk...
	I0906 17:20:37.974483    7577 main.go:141] libmachine: (no-preload-343000) DBG | Writing magic tar header
	I0906 17:20:37.974564    7577 main.go:141] libmachine: (no-preload-343000) DBG | Writing SSH key tar header
	I0906 17:20:37.975180    7577 main.go:141] libmachine: (no-preload-343000) DBG | I0906 17:20:37.975062    7585 common.go:172] Fixing permissions on /Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000 ...
	I0906 17:20:38.366373    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:38.366482    7577 main.go:141] libmachine: (no-preload-343000) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/hyperkit.pid
	I0906 17:20:38.366494    7577 main.go:141] libmachine: (no-preload-343000) DBG | Using UUID 5e897ede-4d14-11ee-9223-149d997cd0f1
	I0906 17:20:38.386953    7577 cache.go:162] opening:  /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/coredns/coredns_v1.10.1
	I0906 17:20:38.393643    7577 main.go:141] libmachine: (no-preload-343000) DBG | Generated MAC 5a:cf:31:83:54:1d
	I0906 17:20:38.393673    7577 main.go:141] libmachine: (no-preload-343000) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=no-preload-343000
	I0906 17:20:38.393718    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5e897ede-4d14-11ee-9223-149d997cd0f1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001251d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Proc
ess)(nil)}
	I0906 17:20:38.393757    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"5e897ede-4d14-11ee-9223-149d997cd0f1", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0001251d0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/bzimage", Initrd:"/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Proc
ess)(nil)}
	I0906 17:20:38.393806    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "5e897ede-4d14-11ee-9223-149d997cd0f1", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/no-preload-343000.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/tty,log=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/bzimage,/Users/jenkins/minikube-integration/17
174-977/.minikube/machines/no-preload-343000/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=no-preload-343000"}
	I0906 17:20:38.393891    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 5e897ede-4d14-11ee-9223-149d997cd0f1 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/no-preload-343000.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/tty,log=/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/console-ring -f kexec,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/bzimage,/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/initrd,earlyprint
k=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=no-preload-343000"
	I0906 17:20:38.393913    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0906 17:20:38.398633    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 DEBUG: hyperkit: Pid is 7617
	I0906 17:20:38.399461    7577 main.go:141] libmachine: (no-preload-343000) DBG | Attempt 0
	I0906 17:20:38.399491    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:38.399556    7577 main.go:141] libmachine: (no-preload-343000) DBG | hyperkit pid from json: 7617
	I0906 17:20:38.400878    7577 main.go:141] libmachine: (no-preload-343000) DBG | Searching for 5a:cf:31:83:54:1d in /var/db/dhcpd_leases ...
	I0906 17:20:38.401040    7577 main.go:141] libmachine: (no-preload-343000) DBG | Found 38 entries in /var/db/dhcpd_leases!
	I0906 17:20:38.401055    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:5e:ee:4b:f:82:d3 ID:1,5e:ee:4b:f:82:d3 Lease:0x64fa68cf}
	I0906 17:20:38.401072    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:6a:a:3d:1:9d:5c ID:1,6a:a:3d:1:9d:5c Lease:0x64fa6893}
	I0906 17:20:38.401086    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:96:fc:73:76:1d:90 ID:1,96:fc:73:76:1d:90 Lease:0x64fa6882}
	I0906 17:20:38.401102    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:52:29:8f:d5:c3:c2 ID:1,52:29:8f:d5:c3:c2 Lease:0x64fa6873}
	I0906 17:20:38.401117    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:20:38.401132    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:20:38.401145    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:20:38.401158    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:20:38.401187    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:20:38.401207    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:20:38.401222    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:20:38.401234    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:20:38.401247    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:20:38.401258    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:20:38.401281    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:20:38.401293    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:20:38.401304    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:20:38.401325    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:20:38.401344    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:20:38.401366    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:20:38.401378    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:20:38.401397    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:20:38.401415    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:20:38.401430    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:20:38.401443    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:20:38.401462    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:20:38.401486    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:20:38.401515    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:20:38.401535    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:20:38.401562    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:20:38.401578    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:20:38.401642    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:20:38.401671    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:20:38.401694    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:20:38.401715    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:20:38.401730    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:20:38.401744    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:20:38.401759    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
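	
	The entries above are the 38 existing leases on the host; the driver keeps re-reading /var/db/dhcpd_leases until the MAC it generated for this VM (5a:cf:31:83:54:1d) shows up, which is how it learns the guest's IP address. The same check can be done by hand on the macOS host, assuming that MAC:
	
	# On the macOS host: has the vmnet DHCP server handed out a lease to the new VM yet?
	# 5a:cf:31:83:54:1d is the MAC generated in this run; substitute the one from your log.
	grep -i -B3 -A1 '5a:cf:31:83:54:1d' /var/db/dhcpd_leases || echo 'no lease for that MAC yet'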
	I0906 17:20:38.406510    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0906 17:20:38.417721    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0906 17:20:38.418628    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 17:20:38.418649    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 17:20:38.418671    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 17:20:38.418685    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 17:20:38.617499    7577 cache.go:162] opening:  /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-controller-manager_v1.28.1
	I0906 17:20:38.673838    7577 cache.go:157] /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/coredns/coredns_v1.10.1 exists
	I0906 17:20:38.673859    7577 cache.go:96] cache image "registry.k8s.io/coredns/coredns:v1.10.1" -> "/Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/coredns/coredns_v1.10.1" took 1.114884442s
	I0906 17:20:38.673870    7577 cache.go:80] save to tar file registry.k8s.io/coredns/coredns:v1.10.1 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/coredns/coredns_v1.10.1 succeeded
	I0906 17:20:38.833576    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0906 17:20:38.833610    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0906 17:20:38.901303    7577 cache.go:162] opening:  /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-proxy_v1.28.1
	I0906 17:20:38.938022    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 17:20:38.938058    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 17:20:38.938072    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 17:20:38.938085    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 17:20:38.938819    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0906 17:20:38.938833    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:38 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0906 17:20:39.233453    7577 cache.go:162] opening:  /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-scheduler_v1.28.1
	I0906 17:20:39.583270    7577 cache.go:157] /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-proxy_v1.28.1 exists
	I0906 17:20:39.583312    7577 cache.go:96] cache image "registry.k8s.io/kube-proxy:v1.28.1" -> "/Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-proxy_v1.28.1" took 2.024374503s
	I0906 17:20:39.583325    7577 cache.go:80] save to tar file registry.k8s.io/kube-proxy:v1.28.1 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-proxy_v1.28.1 succeeded
	I0906 17:20:39.592550    7577 cache.go:162] opening:  /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/pause_3.9
	I0906 17:20:39.645632    7577 cache.go:157] /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/pause_3.9 exists
	I0906 17:20:39.645651    7577 cache.go:96] cache image "registry.k8s.io/pause:3.9" -> "/Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/pause_3.9" took 2.086575668s
	I0906 17:20:39.645659    7577 cache.go:80] save to tar file registry.k8s.io/pause:3.9 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/pause_3.9 succeeded
	I0906 17:20:39.933263    7577 cache.go:162] opening:  /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-apiserver_v1.28.1
	I0906 17:20:40.105345    7577 cache.go:157] /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-controller-manager_v1.28.1 exists
	I0906 17:20:40.105363    7577 cache.go:96] cache image "registry.k8s.io/kube-controller-manager:v1.28.1" -> "/Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-controller-manager_v1.28.1" took 2.546475929s
	I0906 17:20:40.105371    7577 cache.go:80] save to tar file registry.k8s.io/kube-controller-manager:v1.28.1 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-controller-manager_v1.28.1 succeeded
	I0906 17:20:40.306422    7577 cache.go:162] opening:  /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/etcd_3.5.9-0
	I0906 17:20:40.360932    7577 cache.go:157] /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-scheduler_v1.28.1 exists
	I0906 17:20:40.360955    7577 cache.go:96] cache image "registry.k8s.io/kube-scheduler:v1.28.1" -> "/Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-scheduler_v1.28.1" took 2.802027648s
	I0906 17:20:40.360963    7577 cache.go:80] save to tar file registry.k8s.io/kube-scheduler:v1.28.1 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-scheduler_v1.28.1 succeeded
	I0906 17:20:40.401498    7577 main.go:141] libmachine: (no-preload-343000) DBG | Attempt 1
	I0906 17:20:40.401518    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:40.401548    7577 main.go:141] libmachine: (no-preload-343000) DBG | hyperkit pid from json: 7617
	I0906 17:20:40.402446    7577 main.go:141] libmachine: (no-preload-343000) DBG | Searching for 5a:cf:31:83:54:1d in /var/db/dhcpd_leases ...
	I0906 17:20:40.402468    7577 main.go:141] libmachine: (no-preload-343000) DBG | Found 38 entries in /var/db/dhcpd_leases!
	I0906 17:20:40.402484    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:5e:ee:4b:f:82:d3 ID:1,5e:ee:4b:f:82:d3 Lease:0x64fa68cf}
	I0906 17:20:40.402495    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:6a:a:3d:1:9d:5c ID:1,6a:a:3d:1:9d:5c Lease:0x64fa6893}
	I0906 17:20:40.402510    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:96:fc:73:76:1d:90 ID:1,96:fc:73:76:1d:90 Lease:0x64fa6882}
	I0906 17:20:40.402518    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:52:29:8f:d5:c3:c2 ID:1,52:29:8f:d5:c3:c2 Lease:0x64fa6873}
	I0906 17:20:40.402561    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:20:40.402588    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:20:40.402602    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:20:40.402613    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:20:40.402622    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:20:40.402632    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:20:40.402643    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:20:40.402652    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:20:40.402661    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:20:40.402670    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:20:40.402679    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:20:40.402687    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:20:40.402695    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:20:40.402704    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:20:40.402715    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:20:40.402724    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:20:40.402737    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:20:40.402744    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:20:40.402752    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:20:40.402767    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:20:40.402785    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:20:40.402795    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:20:40.402803    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:20:40.402811    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:20:40.402819    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:20:40.402826    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:20:40.402835    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:20:40.402845    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:20:40.402853    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:20:40.402865    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:20:40.402873    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:20:40.402882    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:20:40.402895    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:20:40.402909    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:20:41.285676    7577 cache.go:157] /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-apiserver_v1.28.1 exists
	I0906 17:20:41.285702    7577 cache.go:96] cache image "registry.k8s.io/kube-apiserver:v1.28.1" -> "/Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-apiserver_v1.28.1" took 3.726675461s
	I0906 17:20:41.285713    7577 cache.go:80] save to tar file registry.k8s.io/kube-apiserver:v1.28.1 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/kube-apiserver_v1.28.1 succeeded
	I0906 17:20:42.303541    7577 cache.go:157] /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/etcd_3.5.9-0 exists
	I0906 17:20:42.303560    7577 cache.go:96] cache image "registry.k8s.io/etcd:3.5.9-0" -> "/Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/etcd_3.5.9-0" took 4.744612284s
	I0906 17:20:42.303568    7577 cache.go:80] save to tar file registry.k8s.io/etcd:3.5.9-0 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/images/amd64/registry.k8s.io/etcd_3.5.9-0 succeeded
	I0906 17:20:42.303582    7577 cache.go:87] Successfully saved all images to host disk.
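	Note on the cache step above: each "cache image ... exists / save to tar file ... succeeded" pair records that the image tarball is present under .minikube/cache/images/amd64 and how long saving it took. Below is a small, illustrative Go sketch of checking for such tarballs; the cache layout and image list are assumptions taken from the paths in this log, not minikube's actual code.

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
        "strings"
    )

    // cachePath mirrors the layout seen in the log:
    // <cacheDir>/registry.k8s.io/kube-apiserver_v1.28.1 for "registry.k8s.io/kube-apiserver:v1.28.1".
    func cachePath(cacheDir, image string) string {
        name := strings.ReplaceAll(image, ":", "_") // the tag separator becomes "_"
        return filepath.Join(cacheDir, filepath.FromSlash(name))
    }

    func main() {
        cacheDir := os.ExpandEnv("$HOME/.minikube/cache/images/amd64") // assumed location
        images := []string{
            "registry.k8s.io/kube-apiserver:v1.28.1",
            "registry.k8s.io/etcd:3.5.9-0",
            "registry.k8s.io/pause:3.9",
        }
        for _, img := range images {
            p := cachePath(cacheDir, img)
            if _, err := os.Stat(p); err == nil {
                fmt.Printf("cache image %q -> %q exists\n", img, p)
            } else {
                fmt.Printf("cache image %q missing, would download and save to %q\n", img, p)
            }
        }
    }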
	I0906 17:20:42.403703    7577 main.go:141] libmachine: (no-preload-343000) DBG | Attempt 2
	I0906 17:20:42.403727    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:42.403741    7577 main.go:141] libmachine: (no-preload-343000) DBG | hyperkit pid from json: 7617
	I0906 17:20:42.404530    7577 main.go:141] libmachine: (no-preload-343000) DBG | Searching for 5a:cf:31:83:54:1d in /var/db/dhcpd_leases ...
	I0906 17:20:42.404605    7577 main.go:141] libmachine: (no-preload-343000) DBG | Found 38 entries in /var/db/dhcpd_leases!
	I0906 17:20:42.404617    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:5e:ee:4b:f:82:d3 ID:1,5e:ee:4b:f:82:d3 Lease:0x64fa68cf}
	I0906 17:20:42.404625    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:6a:a:3d:1:9d:5c ID:1,6a:a:3d:1:9d:5c Lease:0x64fa6893}
	I0906 17:20:42.404635    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:96:fc:73:76:1d:90 ID:1,96:fc:73:76:1d:90 Lease:0x64fa6882}
	I0906 17:20:42.404642    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:52:29:8f:d5:c3:c2 ID:1,52:29:8f:d5:c3:c2 Lease:0x64fa6873}
	I0906 17:20:42.404652    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:20:42.404662    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:20:42.404670    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:20:42.404678    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:20:42.404685    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:20:42.404695    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:20:42.404702    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:20:42.404710    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:20:42.404717    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:20:42.404759    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:20:42.404776    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:20:42.404786    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:20:42.404793    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:20:42.404799    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:20:42.404817    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:20:42.404832    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:20:42.404841    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:20:42.404852    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:20:42.404861    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:20:42.404868    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:20:42.404875    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:20:42.404882    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:20:42.404891    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:20:42.404900    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:20:42.404911    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:20:42.404919    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:20:42.404928    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:20:42.404936    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:20:42.404944    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:20:42.404953    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:20:42.404961    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:20:42.404971    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:20:42.404979    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:20:42.404986    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:20:44.078949    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:44 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 0
	I0906 17:20:44.078999    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:44 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 0
	I0906 17:20:44.079009    7577 main.go:141] libmachine: (no-preload-343000) DBG | 2023/09/06 17:20:44 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 0
	I0906 17:20:44.404917    7577 main.go:141] libmachine: (no-preload-343000) DBG | Attempt 3
	I0906 17:20:44.404939    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:44.405048    7577 main.go:141] libmachine: (no-preload-343000) DBG | hyperkit pid from json: 7617
	I0906 17:20:44.406325    7577 main.go:141] libmachine: (no-preload-343000) DBG | Searching for 5a:cf:31:83:54:1d in /var/db/dhcpd_leases ...
	I0906 17:20:44.406400    7577 main.go:141] libmachine: (no-preload-343000) DBG | Found 38 entries in /var/db/dhcpd_leases!
	I0906 17:20:44.406411    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:5e:ee:4b:f:82:d3 ID:1,5e:ee:4b:f:82:d3 Lease:0x64fa68cf}
	I0906 17:20:44.406422    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:6a:a:3d:1:9d:5c ID:1,6a:a:3d:1:9d:5c Lease:0x64fa6893}
	I0906 17:20:44.406432    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:96:fc:73:76:1d:90 ID:1,96:fc:73:76:1d:90 Lease:0x64fa6882}
	I0906 17:20:44.406439    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:52:29:8f:d5:c3:c2 ID:1,52:29:8f:d5:c3:c2 Lease:0x64fa6873}
	I0906 17:20:44.406446    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:20:44.406466    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:20:44.406483    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:20:44.406542    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:20:44.406566    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:20:44.406594    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:20:44.406626    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:20:44.406644    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:20:44.406662    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:20:44.406677    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:20:44.406687    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:20:44.406695    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:20:44.406704    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:20:44.406713    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:20:44.406722    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:20:44.406737    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:20:44.406749    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:20:44.406762    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:20:44.406771    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:20:44.406833    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:20:44.406843    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:20:44.406852    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:20:44.406860    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:20:44.406866    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:20:44.406879    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:20:44.406902    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:20:44.406916    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:20:44.406931    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:20:44.406956    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:20:44.406987    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:20:44.406995    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:20:44.407002    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:20:44.407047    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:20:44.407083    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:20:46.407464    7577 main.go:141] libmachine: (no-preload-343000) DBG | Attempt 4
	I0906 17:20:46.407486    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:46.407565    7577 main.go:141] libmachine: (no-preload-343000) DBG | hyperkit pid from json: 7617
	I0906 17:20:46.408370    7577 main.go:141] libmachine: (no-preload-343000) DBG | Searching for 5a:cf:31:83:54:1d in /var/db/dhcpd_leases ...
	I0906 17:20:46.408491    7577 main.go:141] libmachine: (no-preload-343000) DBG | Found 38 entries in /var/db/dhcpd_leases!
	I0906 17:20:46.408507    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:5e:ee:4b:f:82:d3 ID:1,5e:ee:4b:f:82:d3 Lease:0x64fa68cf}
	I0906 17:20:46.408526    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:6a:a:3d:1:9d:5c ID:1,6a:a:3d:1:9d:5c Lease:0x64fa6893}
	I0906 17:20:46.408538    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:96:fc:73:76:1d:90 ID:1,96:fc:73:76:1d:90 Lease:0x64fa6882}
	I0906 17:20:46.408570    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:52:29:8f:d5:c3:c2 ID:1,52:29:8f:d5:c3:c2 Lease:0x64fa6873}
	I0906 17:20:46.408585    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:66:4e:30:d9:47:d7 ID:1,66:4e:30:d9:47:d7 Lease:0x64fa6834}
	I0906 17:20:46.408596    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:fa:df:3a:c3:c7:4f ID:1,fa:df:3a:c3:c7:4f Lease:0x64fa67fd}
	I0906 17:20:46.408604    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:6e:75:83:a:6:d3 ID:1,6e:75:83:a:6:d3 Lease:0x64fa67db}
	I0906 17:20:46.408626    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:32:c0:88:8f:d8:c8 ID:1,32:c0:88:8f:d8:c8 Lease:0x64fa678d}
	I0906 17:20:46.408638    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:a2:cb:83:ca:9f:9f ID:1,a2:cb:83:ca:9f:9f Lease:0x64fa6780}
	I0906 17:20:46.408651    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:3a:49:b7:fd:2c:43 ID:1,3a:49:b7:fd:2c:43 Lease:0x64f915f6}
	I0906 17:20:46.408668    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:36:6f:40:79:5c:7c ID:1,36:6f:40:79:5c:7c Lease:0x64fa673b}
	I0906 17:20:46.408678    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:a:db:ff:ae:d:c3 ID:1,a:db:ff:ae:d:c3 Lease:0x64f915d0}
	I0906 17:20:46.408686    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:1a:bc:4:b8:d1:99 ID:1,1a:bc:4:b8:d1:99 Lease:0x64f915a7}
	I0906 17:20:46.408694    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:e:0:7c:b1:ce:3c ID:1,e:0:7c:b1:ce:3c Lease:0x64fa66f5}
	I0906 17:20:46.408705    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:5e:5d:83:4f:93:3e ID:1,5e:5d:83:4f:93:3e Lease:0x64fa66ce}
	I0906 17:20:46.408713    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:ca:4f:9:bd:c3:a ID:1,ca:4f:9:bd:c3:a Lease:0x64fa65dc}
	I0906 17:20:46.408723    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:fe:47:5:8:4f:d8 ID:1,fe:47:5:8:4f:d8 Lease:0x64fa65a8}
	I0906 17:20:46.408731    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:52:f3:97:30:7c:3e ID:1,52:f3:97:30:7c:3e Lease:0x64fa659c}
	I0906 17:20:46.408739    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:ca:25:3e:9c:2b:8e ID:1,ca:25:3e:9c:2b:8e Lease:0x64f9141d}
	I0906 17:20:46.408747    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:66:74:35:91:71:b5 ID:1,66:74:35:91:71:b5 Lease:0x64fa6574}
	I0906 17:20:46.408766    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:76:4a:f6:89:c1:bc ID:1,76:4a:f6:89:c1:bc Lease:0x64fa655f}
	I0906 17:20:46.408773    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:62:6d:96:a2:5b:0 ID:1,62:6d:96:a2:5b:0 Lease:0x64fa654a}
	I0906 17:20:46.408781    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:86:cc:8c:2e:5:32 ID:1,86:cc:8c:2e:5:32 Lease:0x64fa64df}
	I0906 17:20:46.408791    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:42:e6:21:d6:59:ae ID:1,42:e6:21:d6:59:ae Lease:0x64fa6474}
	I0906 17:20:46.408799    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:26:23:54:67:45:91 ID:1,26:23:54:67:45:91 Lease:0x64fa643b}
	I0906 17:20:46.408807    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:92:72:33:2f:fc:4a ID:1,92:72:33:2f:fc:4a Lease:0x64fa63af}
	I0906 17:20:46.408815    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ea:20:78:f9:5:1a ID:1,ea:20:78:f9:5:1a Lease:0x64f911af}
	I0906 17:20:46.408824    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:ea:74:70:f4:fd:ef ID:1,ea:74:70:f4:fd:ef Lease:0x64fa6379}
	I0906 17:20:46.408832    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:4e:4b:19:9e:af:ca ID:1,4e:4b:19:9e:af:ca Lease:0x64fa6349}
	I0906 17:20:46.408841    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:46:b3:d2:13:38:42 ID:1,46:b3:d2:13:38:42 Lease:0x64f9103c}
	I0906 17:20:46.408849    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:62:e7:79:ad:2e:98 ID:1,62:e7:79:ad:2e:98 Lease:0x64f91026}
	I0906 17:20:46.408859    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:46:98:ff:ea:32:6c ID:1,46:98:ff:ea:32:6c Lease:0x64fa6176}
	I0906 17:20:46.408871    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:de:45:16:ba:20:f8 ID:1,de:45:16:ba:20:f8 Lease:0x64fa6138}
	I0906 17:20:46.408880    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:a2:95:d5:a5:48:9f ID:1,a2:95:d5:a5:48:9f Lease:0x64fa60c0}
	I0906 17:20:46.408888    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:da:8a:34:c8:9f:61 ID:1,da:8a:34:c8:9f:61 Lease:0x64fa60a2}
	I0906 17:20:46.408897    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:9e:b2:d8:35:b4:2c ID:1,9e:b2:d8:35:b4:2c Lease:0x64fa5fbb}
	I0906 17:20:46.408905    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:52:f2:93:82:a0:72 ID:1,52:f2:93:82:a0:72 Lease:0x64f90e2f}
	I0906 17:20:46.408914    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:b6:52:30:fb:fe:9d ID:1,b6:52:30:fb:fe:9d Lease:0x64fa5eb6}
	I0906 17:20:48.408747    7577 main.go:141] libmachine: (no-preload-343000) DBG | Attempt 5
	I0906 17:20:48.408764    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:48.408830    7577 main.go:141] libmachine: (no-preload-343000) DBG | hyperkit pid from json: 7617
	I0906 17:20:48.409661    7577 main.go:141] libmachine: (no-preload-343000) DBG | Searching for 5a:cf:31:83:54:1d in /var/db/dhcpd_leases ...
	I0906 17:20:48.409726    7577 main.go:141] libmachine: (no-preload-343000) DBG | Found 39 entries in /var/db/dhcpd_leases!
	I0906 17:20:48.409736    7577 main.go:141] libmachine: (no-preload-343000) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:5a:cf:31:83:54:1d ID:1,5a:cf:31:83:54:1d Lease:0x64fa68df}
	I0906 17:20:48.409763    7577 main.go:141] libmachine: (no-preload-343000) DBG | Found match: 5a:cf:31:83:54:1d
	I0906 17:20:48.409780    7577 main.go:141] libmachine: (no-preload-343000) DBG | IP: 192.168.64.40
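	The repeated "Searching for 5a:cf:31:83:54:1d in /var/db/dhcpd_leases" attempts above poll the host's DHCP lease file until an entry with the VM's MAC address appears, then take that entry's IP (192.168.64.40 here). An illustrative Go sketch of such a lookup, assuming the brace-delimited key=value layout macOS uses for /var/db/dhcpd_leases; this is not the hyperkit driver's actual parser.

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "strings"
    )

    // findIPByMAC scans a dhcpd_leases-style file for an entry whose hw_address
    // field contains the given MAC and returns that entry's ip_address.
    func findIPByMAC(path, mac string) (string, error) {
        f, err := os.Open(path)
        if err != nil {
            return "", err
        }
        defer f.Close()

        var ip string
        matched := false
        sc := bufio.NewScanner(f)
        for sc.Scan() {
            line := strings.TrimSpace(sc.Text())
            switch {
            case line == "{": // start of a new lease entry
                ip, matched = "", false
            case strings.HasPrefix(line, "ip_address="):
                ip = strings.TrimPrefix(line, "ip_address=")
            case strings.HasPrefix(line, "hw_address=") && strings.Contains(line, mac):
                matched = true
            case line == "}": // end of the entry: report a hit if both fields were seen
                if matched && ip != "" {
                    return ip, nil
                }
            }
        }
        if err := sc.Err(); err != nil {
            return "", err
        }
        return "", fmt.Errorf("no lease found for %s", mac)
    }

    func main() {
        ip, err := findIPByMAC("/var/db/dhcpd_leases", "5a:cf:31:83:54:1d")
        if err != nil {
            fmt.Println("not found yet, would retry:", err)
            return
        }
        fmt.Println("IP:", ip)
    }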
	I0906 17:20:48.409814    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetConfigRaw
	I0906 17:20:48.410374    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:48.410493    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:48.410599    7577 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0906 17:20:48.410609    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetState
	I0906 17:20:48.410686    7577 main.go:141] libmachine: (no-preload-343000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 17:20:48.410750    7577 main.go:141] libmachine: (no-preload-343000) DBG | hyperkit pid from json: 7617
	I0906 17:20:48.411569    7577 main.go:141] libmachine: Detecting operating system of created instance...
	I0906 17:20:48.411584    7577 main.go:141] libmachine: Waiting for SSH to be available...
	I0906 17:20:48.411594    7577 main.go:141] libmachine: Getting to WaitForSSH function...
	I0906 17:20:48.411601    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:48.411695    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:48.411805    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.411898    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.412009    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:48.412149    7577 main.go:141] libmachine: Using SSH client type: native
	I0906 17:20:48.412575    7577 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.40 22 <nil> <nil>}
	I0906 17:20:48.412584    7577 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0906 17:20:48.492470    7577 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0906 17:20:48.492482    7577 main.go:141] libmachine: Detecting the provisioner...
	I0906 17:20:48.492488    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:48.492659    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:48.492786    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.492903    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.493042    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:48.493218    7577 main.go:141] libmachine: Using SSH client type: native
	I0906 17:20:48.493563    7577 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.40 22 <nil> <nil>}
	I0906 17:20:48.493572    7577 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0906 17:20:48.572841    7577 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2021.02.12-1-g88b5c50-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"
	
	I0906 17:20:48.572904    7577 main.go:141] libmachine: found compatible host: buildroot
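	WaitForSSH and provisioner detection above amount to running "exit 0" and "cat /etc/os-release" over SSH and matching the NAME/ID fields. A hedged Go sketch of that flow using golang.org/x/crypto/ssh; the address, user, and key path are assumptions taken from this log, not minikube's implementation.

    package main

    import (
        "bytes"
        "fmt"
        "os"
        "strings"

        "golang.org/x/crypto/ssh"
    )

    // runSSH runs a single command on the remote host and returns its stdout.
    func runSSH(addr string, cfg *ssh.ClientConfig, cmd string) (string, error) {
        client, err := ssh.Dial("tcp", addr, cfg)
        if err != nil {
            return "", err
        }
        defer client.Close()

        sess, err := client.NewSession()
        if err != nil {
            return "", err
        }
        defer sess.Close()

        var out bytes.Buffer
        sess.Stdout = &out
        if err := sess.Run(cmd); err != nil {
            return "", err
        }
        return out.String(), nil
    }

    func main() {
        key, err := os.ReadFile(os.ExpandEnv("$HOME/.minikube/machines/no-preload-343000/id_rsa")) // assumed key path
        if err != nil {
            panic(err)
        }
        signer, err := ssh.ParsePrivateKey(key)
        if err != nil {
            panic(err)
        }
        cfg := &ssh.ClientConfig{
            User:            "docker",
            Auth:            []ssh.AuthMethod{ssh.PublicKeys(signer)},
            HostKeyCallback: ssh.InsecureIgnoreHostKey(), // acceptable for a throwaway test VM, not for production
        }

        // "exit 0" confirms SSH is reachable; /etc/os-release identifies the provisioner.
        if _, err := runSSH("192.168.64.40:22", cfg, "exit 0"); err != nil {
            panic(err)
        }
        osRelease, err := runSSH("192.168.64.40:22", cfg, "cat /etc/os-release")
        if err != nil {
            panic(err)
        }
        for _, line := range strings.Split(osRelease, "\n") {
            if strings.HasPrefix(line, "ID=") {
                fmt.Println("provisioner:", strings.TrimPrefix(line, "ID="))
            }
        }
    }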
	I0906 17:20:48.572915    7577 main.go:141] libmachine: Provisioning with buildroot...
	I0906 17:20:48.572924    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetMachineName
	I0906 17:20:48.573108    7577 buildroot.go:166] provisioning hostname "no-preload-343000"
	I0906 17:20:48.573121    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetMachineName
	I0906 17:20:48.573235    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:48.573331    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:48.573438    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.573533    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.573631    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:48.573784    7577 main.go:141] libmachine: Using SSH client type: native
	I0906 17:20:48.574110    7577 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.40 22 <nil> <nil>}
	I0906 17:20:48.574120    7577 main.go:141] libmachine: About to run SSH command:
	sudo hostname no-preload-343000 && echo "no-preload-343000" | sudo tee /etc/hostname
	I0906 17:20:48.656906    7577 main.go:141] libmachine: SSH cmd err, output: <nil>: no-preload-343000
	
	I0906 17:20:48.656925    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:48.657054    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:48.657145    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.657252    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.657366    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:48.657501    7577 main.go:141] libmachine: Using SSH client type: native
	I0906 17:20:48.657830    7577 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.40 22 <nil> <nil>}
	I0906 17:20:48.657843    7577 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sno-preload-343000' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 no-preload-343000/g' /etc/hosts;
				else 
					echo '127.0.1.1 no-preload-343000' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0906 17:20:48.737357    7577 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0906 17:20:48.737377    7577 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/17174-977/.minikube CaCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/17174-977/.minikube}
	I0906 17:20:48.737390    7577 buildroot.go:174] setting up certificates
	I0906 17:20:48.737402    7577 provision.go:83] configureAuth start
	I0906 17:20:48.737411    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetMachineName
	I0906 17:20:48.737573    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetIP
	I0906 17:20:48.737688    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:48.737774    7577 provision.go:138] copyHostCerts
	I0906 17:20:48.737861    7577 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem, removing ...
	I0906 17:20:48.737871    7577 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem
	I0906 17:20:48.738015    7577 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/ca.pem (1078 bytes)
	I0906 17:20:48.738225    7577 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem, removing ...
	I0906 17:20:48.738237    7577 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem
	I0906 17:20:48.738309    7577 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/cert.pem (1123 bytes)
	I0906 17:20:48.738464    7577 exec_runner.go:144] found /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem, removing ...
	I0906 17:20:48.738469    7577 exec_runner.go:203] rm: /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem
	I0906 17:20:48.738539    7577 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/17174-977/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/17174-977/.minikube/key.pem (1679 bytes)
	I0906 17:20:48.738673    7577 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca-key.pem org=jenkins.no-preload-343000 san=[192.168.64.40 192.168.64.40 localhost 127.0.0.1 minikube no-preload-343000]
	I0906 17:20:48.929287    7577 provision.go:172] copyRemoteCerts
	I0906 17:20:48.929349    7577 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0906 17:20:48.929365    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:48.929510    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:48.929614    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:48.929723    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:48.929814    7577 sshutil.go:53] new ssh client: &{IP:192.168.64.40 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/id_rsa Username:docker}
	I0906 17:20:48.974411    7577 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1078 bytes)
	I0906 17:20:48.990653    7577 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server.pem --> /etc/docker/server.pem (1229 bytes)
	I0906 17:20:49.007834    7577 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0906 17:20:49.023936    7577 provision.go:86] duration metric: configureAuth took 286.521201ms
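	configureAuth above generates a server certificate whose SANs cover the VM IP, loopback, and the machine hostnames, then copies it to /etc/docker on the guest. A rough Go sketch of issuing a certificate with those SANs; for brevity it self-signs, whereas minikube signs with its CA key (ca-key.pem in the log).

    package main

    import (
        "crypto/ecdsa"
        "crypto/elliptic"
        "crypto/rand"
        "crypto/x509"
        "crypto/x509/pkix"
        "encoding/pem"
        "math/big"
        "net"
        "os"
        "time"
    )

    func main() {
        // Key pair for the server certificate.
        key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
        if err != nil {
            panic(err)
        }

        tmpl := &x509.Certificate{
            SerialNumber: big.NewInt(1),
            Subject:      pkix.Name{Organization: []string{"jenkins.no-preload-343000"}},
            NotBefore:    time.Now(),
            NotAfter:     time.Now().Add(365 * 24 * time.Hour),
            KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
            ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
            // SANs mirroring the log: the VM IP, loopback, and the hostnames.
            IPAddresses: []net.IP{net.ParseIP("192.168.64.40"), net.ParseIP("127.0.0.1")},
            DNSNames:    []string{"localhost", "minikube", "no-preload-343000"},
        }

        // Self-signed here for brevity; a real setup would pass the CA cert and key as parent/signer.
        der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
        if err != nil {
            panic(err)
        }
        out, err := os.Create("server.pem")
        if err != nil {
            panic(err)
        }
        defer out.Close()
        pem.Encode(out, &pem.Block{Type: "CERTIFICATE", Bytes: der})
    }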
	I0906 17:20:49.023948    7577 buildroot.go:189] setting minikube options for container-runtime
	I0906 17:20:49.024156    7577 config.go:182] Loaded profile config "no-preload-343000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 17:20:49.024196    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:49.024424    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:49.024642    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:49.024797    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.024952    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.025116    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:49.025337    7577 main.go:141] libmachine: Using SSH client type: native
	I0906 17:20:49.025671    7577 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.40 22 <nil> <nil>}
	I0906 17:20:49.025683    7577 main.go:141] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0906 17:20:49.101468    7577 main.go:141] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0906 17:20:49.101478    7577 buildroot.go:70] root file system type: tmpfs
	I0906 17:20:49.101556    7577 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0906 17:20:49.101568    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:49.101698    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:49.101800    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.101887    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.101996    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:49.102134    7577 main.go:141] libmachine: Using SSH client type: native
	I0906 17:20:49.102432    7577 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.40 22 <nil> <nil>}
	I0906 17:20:49.102479    7577 main.go:141] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0906 17:20:49.187760    7577 main.go:141] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0906 17:20:49.187782    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:49.187924    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:49.188010    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.188100    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.188197    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:49.188333    7577 main.go:141] libmachine: Using SSH client type: native
	I0906 17:20:49.188634    7577 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.40 22 <nil> <nil>}
	I0906 17:20:49.188647    7577 main.go:141] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0906 17:20:49.695068    7577 main.go:141] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
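	The "diff ... || { mv ...; systemctl ...; }" command above makes the docker.service update idempotent: the unit is replaced and the service reloaded/restarted only when the rendered file differs from what is installed (here the old unit did not exist yet, so it was installed and enabled). A small illustrative Go sketch of that compare-then-write step; the systemctl follow-up is only printed, not executed.

    package main

    import (
        "bytes"
        "fmt"
        "os"
    )

    // updateUnit writes newContent to path only when it differs from what is
    // already installed, and reports whether a reload/restart would be needed.
    func updateUnit(path string, newContent []byte) (changed bool, err error) {
        old, err := os.ReadFile(path)
        if err == nil && bytes.Equal(old, newContent) {
            return false, nil // identical: keep the running service untouched
        }
        if err := os.WriteFile(path, newContent, 0o644); err != nil {
            return false, err
        }
        return true, nil
    }

    func main() {
        unit := []byte("[Unit]\nDescription=Docker Application Container Engine\n") // truncated example unit
        changed, err := updateUnit("/lib/systemd/system/docker.service", unit)
        if err != nil {
            panic(err)
        }
        if changed {
            // Equivalent of the shell in the log once the file changed.
            fmt.Println("would run: systemctl daemon-reload && systemctl enable docker && systemctl restart docker")
        }
    }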
	I0906 17:20:49.695098    7577 main.go:141] libmachine: Checking connection to Docker...
	I0906 17:20:49.695127    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetURL
	I0906 17:20:49.695352    7577 main.go:141] libmachine: Docker is up and running!
	I0906 17:20:49.695388    7577 main.go:141] libmachine: Reticulating splines...
	I0906 17:20:49.695406    7577 client.go:171] LocalClient.Create took 12.100352702s
	I0906 17:20:49.695441    7577 start.go:167] duration metric: libmachine.API.Create for "no-preload-343000" took 12.100418903s
	I0906 17:20:49.695451    7577 start.go:300] post-start starting for "no-preload-343000" (driver="hyperkit")
	I0906 17:20:49.695461    7577 start.go:329] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0906 17:20:49.695492    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:49.695681    7577 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0906 17:20:49.695694    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:49.695862    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:49.695995    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.696159    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:49.696371    7577 sshutil.go:53] new ssh client: &{IP:192.168.64.40 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/id_rsa Username:docker}
	I0906 17:20:49.740522    7577 ssh_runner.go:195] Run: cat /etc/os-release
	I0906 17:20:49.743147    7577 info.go:137] Remote host: Buildroot 2021.02.12
	I0906 17:20:49.743159    7577 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17174-977/.minikube/addons for local assets ...
	I0906 17:20:49.743247    7577 filesync.go:126] Scanning /Users/jenkins/minikube-integration/17174-977/.minikube/files for local assets ...
	I0906 17:20:49.743408    7577 filesync.go:149] local asset: /Users/jenkins/minikube-integration/17174-977/.minikube/files/etc/ssl/certs/14382.pem -> 14382.pem in /etc/ssl/certs
	I0906 17:20:49.743585    7577 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0906 17:20:49.750041    7577 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/17174-977/.minikube/files/etc/ssl/certs/14382.pem --> /etc/ssl/certs/14382.pem (1708 bytes)
	I0906 17:20:49.765728    7577 start.go:303] post-start completed in 70.269681ms
	I0906 17:20:49.765754    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetConfigRaw
	I0906 17:20:49.766347    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetIP
	I0906 17:20:49.766491    7577 profile.go:148] Saving config to /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/config.json ...
	I0906 17:20:49.766809    7577 start.go:128] duration metric: createHost completed in 12.207223755s
	I0906 17:20:49.766825    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:49.766908    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:49.767012    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.767094    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.767167    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:49.767272    7577 main.go:141] libmachine: Using SSH client type: native
	I0906 17:20:49.767569    7577 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x140e6c0] 0x1411760 <nil>  [] 0s} 192.168.64.40 22 <nil> <nil>}
	I0906 17:20:49.767577    7577 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0906 17:20:49.842709    7577 main.go:141] libmachine: SSH cmd err, output: <nil>: 1694046049.996531144
	
	I0906 17:20:49.842721    7577 fix.go:206] guest clock: 1694046049.996531144
	I0906 17:20:49.842726    7577 fix.go:219] Guest: 2023-09-06 17:20:49.996531144 -0700 PDT Remote: 2023-09-06 17:20:49.766817 -0700 PDT m=+12.818605930 (delta=229.714144ms)
	I0906 17:20:49.842741    7577 fix.go:190] guest clock delta is within tolerance: 229.714144ms
	I0906 17:20:49.842745    7577 start.go:83] releasing machines lock for "no-preload-343000", held for 12.283280467s
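	The guest clock check above parses the VM's "date +%s.%N" output and compares it with the host clock; the ~229ms delta is within tolerance, so no clock adjustment is forced. An illustrative Go sketch of that comparison, using the values from this log; the 1s tolerance is an assumption for the sketch.

    package main

    import (
        "fmt"
        "strconv"
        "time"
    )

    // withinTolerance parses the guest's `date +%s.%N` output and checks whether
    // its offset from the host clock stays under the given tolerance.
    func withinTolerance(guestOut string, hostNow time.Time, tolerance time.Duration) (time.Duration, bool, error) {
        secs, err := strconv.ParseFloat(guestOut, 64)
        if err != nil {
            return 0, false, err
        }
        guest := time.Unix(0, int64(secs*float64(time.Second)))
        delta := guest.Sub(hostNow)
        if delta < 0 {
            delta = -delta
        }
        return delta, delta <= tolerance, nil
    }

    func main() {
        // Guest timestamp and host time taken from the log lines above.
        host := time.Unix(1694046049, 766817000)
        delta, ok, err := withinTolerance("1694046049.996531144", host, time.Second)
        if err != nil {
            panic(err)
        }
        fmt.Printf("guest clock delta %v, within tolerance: %v\n", delta, ok)
    }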
	I0906 17:20:49.842771    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:49.842911    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetIP
	I0906 17:20:49.843007    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:49.843336    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:49.843447    7577 main.go:141] libmachine: (no-preload-343000) Calling .DriverName
	I0906 17:20:49.843516    7577 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0906 17:20:49.843547    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:49.843577    7577 ssh_runner.go:195] Run: cat /version.json
	I0906 17:20:49.843589    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHHostname
	I0906 17:20:49.843651    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:49.843686    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHPort
	I0906 17:20:49.843763    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.843778    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHKeyPath
	I0906 17:20:49.843891    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:49.843912    7577 main.go:141] libmachine: (no-preload-343000) Calling .GetSSHUsername
	I0906 17:20:49.843970    7577 sshutil.go:53] new ssh client: &{IP:192.168.64.40 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/id_rsa Username:docker}
	I0906 17:20:49.843988    7577 sshutil.go:53] new ssh client: &{IP:192.168.64.40 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/id_rsa Username:docker}
	I0906 17:20:49.933774    7577 ssh_runner.go:195] Run: systemctl --version
	I0906 17:20:49.937749    7577 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0906 17:20:49.941240    7577 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0906 17:20:49.941294    7577 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0906 17:20:49.950979    7577 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0906 17:20:49.950995    7577 start.go:466] detecting cgroup driver to use...
	I0906 17:20:49.951100    7577 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 17:20:49.963709    7577 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0906 17:20:49.971117    7577 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0906 17:20:49.978236    7577 containerd.go:145] configuring containerd to use "cgroupfs" as cgroup driver...
	I0906 17:20:49.978279    7577 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0906 17:20:49.985317    7577 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0906 17:20:49.992330    7577 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0906 17:20:49.999634    7577 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0906 17:20:50.006654    7577 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0906 17:20:50.014128    7577 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0906 17:20:50.021745    7577 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0906 17:20:50.028106    7577 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0906 17:20:50.034409    7577 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:20:50.129298    7577 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0906 17:20:50.141740    7577 start.go:466] detecting cgroup driver to use...
	I0906 17:20:50.141810    7577 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0906 17:20:50.154838    7577 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 17:20:50.165450    7577 ssh_runner.go:195] Run: sudo systemctl stop -f containerd
	I0906 17:20:50.181166    7577 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 17:20:50.190000    7577 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 17:20:50.198411    7577 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0906 17:20:50.218370    7577 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 17:20:50.227232    7577 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 17:20:50.240029    7577 ssh_runner.go:195] Run: which cri-dockerd
	I0906 17:20:50.242502    7577 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/cri-docker.service.d
	I0906 17:20:50.248199    7577 ssh_runner.go:362] scp memory --> /etc/systemd/system/cri-docker.service.d/10-cni.conf (189 bytes)
	I0906 17:20:50.259676    7577 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0906 17:20:50.353478    7577 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0906 17:20:50.446480    7577 docker.go:535] configuring docker to use "cgroupfs" as cgroup driver...
	I0906 17:20:50.446494    7577 ssh_runner.go:362] scp memory --> /etc/docker/daemon.json (144 bytes)
	I0906 17:20:50.458194    7577 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:20:50.541372    7577 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0906 17:20:51.789015    7577 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.247640691s)
	I0906 17:20:51.789076    7577 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0906 17:20:51.875879    7577 ssh_runner.go:195] Run: sudo systemctl unmask cri-docker.socket
	I0906 17:20:51.976343    7577 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0906 17:20:52.068308    7577 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 17:20:52.155009    7577 ssh_runner.go:195] Run: sudo systemctl restart cri-docker.socket
	I0906 17:20:52.190899    7577 out.go:177] 
	W0906 17:20:52.212618    7577 out.go:239] X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	X Exiting due to RUNTIME_ENABLE: Failed to enable container runtime: sudo systemctl restart cri-docker.socket: Process exited with status 1
	stdout:
	
	stderr:
	Job failed. See "journalctl -xe" for details.
	
	W0906 17:20:52.212655    7577 out.go:239] * 
	* 
	W0906 17:20:52.214097    7577 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0906 17:20:52.297682    7577 out.go:177] 

                                                
                                                
** /stderr **
start_stop_delete_test.go:188: failed starting minikube -first start-. args "out/minikube-darwin-amd64 start -p no-preload-343000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.28.1": exit status 90
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000: exit status 6 (139.451083ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0906 17:20:52.467207    7797 status.go:415] kubeconfig endpoint: extract IP: "no-preload-343000" does not appear in /Users/jenkins/minikube-integration/17174-977/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "no-preload-343000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestStartStop/group/no-preload/serial/FirstStart (15.54s)
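The first start above fails in RUNTIME_ENABLE: `sudo systemctl restart cri-docker.socket` exits with status 1 and the error only points at `journalctl -xe`. A minimal Go sketch, reusing the IP, user, and key path from the sshutil lines in the log (the journalctl unit filter is an assumption about where the failure is recorded), that re-runs the failing step and pulls the nearby journal entries for diagnosis:

```go
package main

import (
	"fmt"
	"os/exec"
)

// runOverSSH executes cmd on the no-preload VM using the same key and user
// the test's ssh client used (see the sshutil lines in the log above).
func runOverSSH(cmd string) (string, error) {
	out, err := exec.Command("ssh",
		"-i", "/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/id_rsa",
		"-o", "StrictHostKeyChecking=no",
		"docker@192.168.64.40",
		cmd).CombinedOutput()
	return string(out), err
}

func main() {
	// Re-run the step that exited with status 1 in the failed start.
	if out, err := runOverSSH("sudo systemctl restart cri-docker.socket"); err != nil {
		fmt.Printf("restart failed: %v\n%s\n", err, out)
	}
	// The error text says "See journalctl -xe"; narrow it to the cri-docker units.
	out, _ := runOverSSH("sudo journalctl -xe -u cri-docker.socket -u cri-docker.service --no-pager | tail -n 50")
	fmt.Println(out)
}
```

The tail length and unit list are arbitrary choices; the point is only to capture the journal context the error message refers to.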

                                                
                                    
x
+
TestStartStop/group/no-preload/serial/DeployApp (0.3s)

                                                
                                                
=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-343000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) Non-zero exit: kubectl --context no-preload-343000 create -f testdata/busybox.yaml: exit status 1 (35.007631ms)

                                                
                                                
** stderr ** 
	error: no openapi getter

                                                
                                                
** /stderr **
start_stop_delete_test.go:196: kubectl --context no-preload-343000 create -f testdata/busybox.yaml failed: exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000: exit status 6 (131.223842ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0906 17:20:52.635258    7803 status.go:415] kubeconfig endpoint: extract IP: "no-preload-343000" does not appear in /Users/jenkins/minikube-integration/17174-977/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "no-preload-343000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000: exit status 6 (135.712953ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0906 17:20:52.771442    7808 status.go:415] kubeconfig endpoint: extract IP: "no-preload-343000" does not appear in /Users/jenkins/minikube-integration/17174-977/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "no-preload-343000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestStartStop/group/no-preload/serial/DeployApp (0.30s)
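`kubectl create` fails here with `error: no openapi getter` because, as status.go:415 reports, the `no-preload-343000` context never made it into the kubeconfig after the aborted first start. A minimal sketch, assuming client-go is available, of checking the kubeconfig for the context before attempting the deploy:

```go
package main

import (
	"fmt"
	"os"

	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Paths and names copied from the log above.
	kubeconfig := "/Users/jenkins/minikube-integration/17174-977/kubeconfig"
	name := "no-preload-343000"

	cfg, err := clientcmd.LoadFromFile(kubeconfig)
	if err != nil {
		fmt.Fprintf(os.Stderr, "cannot read kubeconfig: %v\n", err)
		os.Exit(1)
	}
	if _, ok := cfg.Contexts[name]; !ok {
		// This is the state the post-mortem reports: the profile shows "Running"
		// but the context is absent, so any kubectl --context call will fail.
		fmt.Fprintf(os.Stderr, "context %q not in kubeconfig; run `minikube update-context -p %s`\n", name, name)
		os.Exit(1)
	}
	fmt.Printf("context %q present; kubectl --context %s create ... can proceed\n", name, name)
}
```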

                                                
                                    
x
+
TestStartStop/group/no-preload/serial/EnableAddonWhileActive (105.62s)

                                                
                                                
=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-343000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
E0906 17:20:55.856670    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:21:06.097126    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:21:18.295208    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:18.300295    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:18.311611    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:18.333783    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:18.375933    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:18.456249    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:18.616612    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:18.938820    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:19.580028    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:20.860932    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:23.421977    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:26.577875    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:21:28.543567    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:38.784242    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:21:59.264288    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:22:07.539609    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:22:11.840074    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:22:15.045378    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 17:22:15.045378    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 17:22:19.399432    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:19.405885    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:19.416785    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:19.437445    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:19.477923    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:19.558590    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:19.719522    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:20.039748    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:20.680548    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:21.962077    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:24.522780    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:29.644003    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
start_stop_delete_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-343000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: exit status 10 (1m45.452283065s)

                                                
                                                
-- stdout --
	* metrics-server is an addon maintained by Kubernetes. For any concerns contact minikube on GitHub.
	You can view the list of minikube maintainers at: https://github.com/kubernetes/minikube/blob/master/OWNERS
	  - Using image fake.domain/registry.k8s.io/echoserver:1.4
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to MK_ADDON_ENABLE: enable failed: run callbacks: running callbacks: [sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.1/kubectl apply --force -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: Process exited with status 1
	stdout:
	
	stderr:
	sudo: /var/lib/minikube/binaries/v1.28.1/kubectl: command not found
	]
	* 
	╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                           │
	│    * If the above advice does not help, please let us know:                                                               │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                             │
	│                                                                                                                           │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                  │
	│    * Please also attach the following file to the GitHub issue:                                                           │
	│    * - /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube_addons_2bafae6fa40fec163538f94366e390b0317a8b15_0.log    │
	│                                                                                                                           │
	╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
start_stop_delete_test.go:207: failed to enable an addon post-stop. args "out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-343000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain": exit status 10
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context no-preload-343000 describe deploy/metrics-server -n kube-system
start_stop_delete_test.go:215: (dbg) Non-zero exit: kubectl --context no-preload-343000 describe deploy/metrics-server -n kube-system: exit status 1 (34.780811ms)

                                                
                                                
** stderr ** 
	error: context "no-preload-343000" does not exist

                                                
                                                
** /stderr **
start_stop_delete_test.go:217: failed to get info on auto-pause deployments. args "kubectl --context no-preload-343000 describe deploy/metrics-server -n kube-system": exit status 1
start_stop_delete_test.go:221: addon did not load correct image. Expected to contain " fake.domain/registry.k8s.io/echoserver:1.4". Addon deployment info: 
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000: exit status 6 (131.862391ms)

                                                
                                                
-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

                                                
                                                
-- /stdout --
** stderr ** 
	E0906 17:22:38.390804    7834 status.go:415] kubeconfig endpoint: extract IP: "no-preload-343000" does not appear in /Users/jenkins/minikube-integration/17174-977/kubeconfig

                                                
                                                
** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "no-preload-343000" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (105.62s)
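The addon enable fails on the node side: `/var/lib/minikube/binaries/v1.28.1/kubectl: command not found`, i.e. the aborted first start never provisioned the per-version kubectl that the addon callback shells out to. A minimal sketch (same assumed SSH details as in the earlier sketch) that checks for that binary before `addons enable` is retried:

```go
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// Binary path from the addon error; SSH details from the sshutil lines above.
	const bin = "/var/lib/minikube/binaries/v1.28.1/kubectl"
	out, err := exec.Command("ssh",
		"-i", "/Users/jenkins/minikube-integration/17174-977/.minikube/machines/no-preload-343000/id_rsa",
		"-o", "StrictHostKeyChecking=no",
		"docker@192.168.64.40",
		"test -x "+bin+" && "+bin+" version --client").CombinedOutput()
	if err != nil {
		fmt.Printf("%s missing or not executable (%v); the metrics-server apply cannot succeed until it is provisioned\n%s", bin, err, out)
		return
	}
	fmt.Printf("%s", out)
}
```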

                                                
                                    

Test pass (272/300)

Order  Passed test  Duration (s)
3 TestDownloadOnly/v1.16.0/json-events 12.04
4 TestDownloadOnly/v1.16.0/preload-exists 0
7 TestDownloadOnly/v1.16.0/kubectl 0
8 TestDownloadOnly/v1.16.0/LogsDuration 0.28
10 TestDownloadOnly/v1.28.1/json-events 6.65
11 TestDownloadOnly/v1.28.1/preload-exists 0
14 TestDownloadOnly/v1.28.1/kubectl 0
15 TestDownloadOnly/v1.28.1/LogsDuration 0.27
16 TestDownloadOnly/DeleteAll 0.37
17 TestDownloadOnly/DeleteAlwaysSucceeds 0.35
19 TestBinaryMirror 0.97
20 TestOffline 55.49
22 TestAddons/Setup 126.61
24 TestAddons/parallel/Registry 15
25 TestAddons/parallel/Ingress 23.51
26 TestAddons/parallel/InspektorGadget 10.49
27 TestAddons/parallel/MetricsServer 5.47
28 TestAddons/parallel/HelmTiller 10.36
30 TestAddons/parallel/CSI 49.51
31 TestAddons/parallel/Headlamp 13.24
32 TestAddons/parallel/CloudSpanner 5.4
35 TestAddons/serial/GCPAuth/Namespaces 0.09
36 TestAddons/StoppedEnableDisable 5.7
37 TestCertOptions 41.08
38 TestCertExpiration 242.38
39 TestDockerFlags 39.57
40 TestForceSystemdFlag 40.19
44 TestHyperKitDriverInstallOrUpdate 6.63
47 TestErrorSpam/setup 34.28
48 TestErrorSpam/start 1.53
49 TestErrorSpam/status 0.43
50 TestErrorSpam/pause 1.21
51 TestErrorSpam/unpause 1.22
52 TestErrorSpam/stop 3.61
55 TestFunctional/serial/CopySyncFile 0
56 TestFunctional/serial/StartWithProxy 51.66
57 TestFunctional/serial/AuditLog 0
58 TestFunctional/serial/SoftStart 40.83
59 TestFunctional/serial/KubeContext 0.03
60 TestFunctional/serial/KubectlGetPods 0.06
63 TestFunctional/serial/CacheCmd/cache/add_remote 4.42
64 TestFunctional/serial/CacheCmd/cache/add_local 1.4
65 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.06
66 TestFunctional/serial/CacheCmd/cache/list 0.06
67 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.16
68 TestFunctional/serial/CacheCmd/cache/cache_reload 1.34
69 TestFunctional/serial/CacheCmd/cache/delete 0.13
70 TestFunctional/serial/MinikubeKubectlCmd 0.53
71 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.73
72 TestFunctional/serial/ExtraConfig 40.42
73 TestFunctional/serial/ComponentHealth 0.05
74 TestFunctional/serial/LogsCmd 2.73
75 TestFunctional/serial/LogsFileCmd 2.59
76 TestFunctional/serial/InvalidService 4.35
78 TestFunctional/parallel/ConfigCmd 0.56
79 TestFunctional/parallel/DashboardCmd 13.78
80 TestFunctional/parallel/DryRun 1.03
81 TestFunctional/parallel/InternationalLanguage 0.45
82 TestFunctional/parallel/StatusCmd 0.53
86 TestFunctional/parallel/ServiceCmdConnect 11.37
87 TestFunctional/parallel/AddonsCmd 0.23
88 TestFunctional/parallel/PersistentVolumeClaim 26.27
90 TestFunctional/parallel/SSHCmd 0.37
91 TestFunctional/parallel/CpCmd 0.57
92 TestFunctional/parallel/MySQL 28.36
93 TestFunctional/parallel/FileSync 0.16
94 TestFunctional/parallel/CertSync 0.94
98 TestFunctional/parallel/NodeLabels 0.06
100 TestFunctional/parallel/NonActiveRuntimeDisabled 0.2
102 TestFunctional/parallel/License 0.55
103 TestFunctional/parallel/Version/short 0.12
104 TestFunctional/parallel/Version/components 0.44
105 TestFunctional/parallel/ImageCommands/ImageListShort 0.17
106 TestFunctional/parallel/ImageCommands/ImageListTable 0.17
107 TestFunctional/parallel/ImageCommands/ImageListJson 0.15
108 TestFunctional/parallel/ImageCommands/ImageListYaml 0.17
109 TestFunctional/parallel/ImageCommands/ImageBuild 2.49
110 TestFunctional/parallel/ImageCommands/Setup 2.3
111 TestFunctional/parallel/DockerEnv/bash 0.73
112 TestFunctional/parallel/UpdateContextCmd/no_changes 0.17
113 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.17
114 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.18
115 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 3.31
116 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 2.42
117 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 5.34
118 TestFunctional/parallel/ImageCommands/ImageSaveToFile 1.26
119 TestFunctional/parallel/ImageCommands/ImageRemove 0.34
120 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.3
121 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 1.4
122 TestFunctional/parallel/ProfileCmd/profile_not_create 0.31
123 TestFunctional/parallel/ProfileCmd/profile_list 0.27
124 TestFunctional/parallel/ProfileCmd/profile_json_output 0.27
126 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.41
127 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
129 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 11.23
130 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
131 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
132 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.03
133 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.02
134 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.02
135 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
136 TestFunctional/parallel/ServiceCmd/DeployApp 8.11
137 TestFunctional/parallel/ServiceCmd/List 0.76
138 TestFunctional/parallel/ServiceCmd/JSONOutput 0.76
139 TestFunctional/parallel/ServiceCmd/HTTPS 0.42
140 TestFunctional/parallel/ServiceCmd/Format 0.43
141 TestFunctional/parallel/ServiceCmd/URL 0.43
143 TestFunctional/parallel/MountCmd/specific-port 1.7
144 TestFunctional/parallel/MountCmd/VerifyCleanup 1.95
145 TestFunctional/delete_addon-resizer_images 0.13
146 TestFunctional/delete_my-image_image 0.05
147 TestFunctional/delete_minikube_cached_images 0.05
153 TestIngressAddonLegacy/StartLegacyK8sCluster 71.37
155 TestIngressAddonLegacy/serial/ValidateIngressAddonActivation 11.77
156 TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation 0.61
157 TestIngressAddonLegacy/serial/ValidateIngressAddons 38.26
160 TestJSONOutput/start/Command 51.39
161 TestJSONOutput/start/Audit 0
163 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
164 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
166 TestJSONOutput/pause/Command 0.45
167 TestJSONOutput/pause/Audit 0
169 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
170 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
172 TestJSONOutput/unpause/Command 0.41
173 TestJSONOutput/unpause/Audit 0
175 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
176 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
178 TestJSONOutput/stop/Command 8.16
179 TestJSONOutput/stop/Audit 0
181 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
182 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
183 TestErrorJSONOutput 0.7
188 TestMainNoArgs 0.06
192 TestMountStart/serial/StartWithMountFirst 16.24
193 TestMountStart/serial/VerifyMountFirst 0.28
194 TestMountStart/serial/StartWithMountSecond 16.19
195 TestMountStart/serial/VerifyMountSecond 0.28
196 TestMountStart/serial/DeleteFirst 2.41
197 TestMountStart/serial/VerifyMountPostDelete 0.28
198 TestMountStart/serial/Stop 2.21
199 TestMountStart/serial/RestartStopped 16.61
200 TestMountStart/serial/VerifyMountPostStop 0.28
203 TestMultiNode/serial/FreshStart2Nodes 97.2
204 TestMultiNode/serial/DeployApp2Nodes 4.45
205 TestMultiNode/serial/PingHostFrom2Pods 0.8
206 TestMultiNode/serial/AddNode 32.42
207 TestMultiNode/serial/ProfileList 0.18
208 TestMultiNode/serial/CopyFile 4.78
209 TestMultiNode/serial/StopNode 2.64
210 TestMultiNode/serial/StartAfterStop 27.22
211 TestMultiNode/serial/RestartKeepsNodes 198.31
212 TestMultiNode/serial/DeleteNode 2.98
213 TestMultiNode/serial/StopMultiNode 16.43
214 TestMultiNode/serial/RestartMultiNode 99.52
215 TestMultiNode/serial/ValidateNameConflict 44.56
219 TestPreload 142.72
221 TestScheduledStopUnix 105.7
222 TestSkaffold 107.8
225 TestRunningBinaryUpgrade 169.56
227 TestKubernetesUpgrade 164.1
240 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 3.58
241 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 6.67
242 TestStoppedBinaryUpgrade/Setup 0.49
243 TestStoppedBinaryUpgrade/Upgrade 154.18
246 TestStoppedBinaryUpgrade/MinikubeLogs 2.7
255 TestNoKubernetes/serial/StartNoK8sWithVersion 0.41
256 TestNoKubernetes/serial/StartWithK8s 37.8
257 TestNetworkPlugins/group/auto/Start 59.25
258 TestNoKubernetes/serial/StartWithStopK8s 7.69
259 TestNoKubernetes/serial/Start 16.21
260 TestNoKubernetes/serial/VerifyK8sNotRunning 0.12
261 TestNoKubernetes/serial/ProfileList 0.51
262 TestNoKubernetes/serial/Stop 2.25
263 TestNetworkPlugins/group/auto/KubeletFlags 0.14
264 TestNetworkPlugins/group/auto/NetCatPod 10.23
265 TestNoKubernetes/serial/StartNoArgs 15.49
266 TestNetworkPlugins/group/auto/DNS 0.14
267 TestNetworkPlugins/group/auto/Localhost 0.11
268 TestNetworkPlugins/group/auto/HairPin 0.11
269 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.12
270 TestNetworkPlugins/group/kindnet/Start 58.38
271 TestNetworkPlugins/group/calico/Start 82.83
272 TestNetworkPlugins/group/kindnet/ControllerPod 5.01
273 TestNetworkPlugins/group/kindnet/KubeletFlags 0.14
274 TestNetworkPlugins/group/kindnet/NetCatPod 10.21
275 TestNetworkPlugins/group/kindnet/DNS 0.14
276 TestNetworkPlugins/group/kindnet/Localhost 0.13
277 TestNetworkPlugins/group/kindnet/HairPin 0.13
278 TestNetworkPlugins/group/calico/ControllerPod 5.02
279 TestNetworkPlugins/group/custom-flannel/Start 60.68
280 TestNetworkPlugins/group/calico/KubeletFlags 0.15
281 TestNetworkPlugins/group/calico/NetCatPod 10.22
282 TestNetworkPlugins/group/calico/DNS 0.18
283 TestNetworkPlugins/group/calico/Localhost 0.15
284 TestNetworkPlugins/group/calico/HairPin 0.14
285 TestNetworkPlugins/group/false/Start 89.78
286 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.16
287 TestNetworkPlugins/group/custom-flannel/NetCatPod 10.27
288 TestNetworkPlugins/group/custom-flannel/DNS 0.13
289 TestNetworkPlugins/group/custom-flannel/Localhost 0.1
290 TestNetworkPlugins/group/custom-flannel/HairPin 0.1
291 TestNetworkPlugins/group/enable-default-cni/Start 49.1
292 TestNetworkPlugins/group/false/KubeletFlags 0.14
293 TestNetworkPlugins/group/false/NetCatPod 11.18
294 TestNetworkPlugins/group/false/DNS 0.17
295 TestNetworkPlugins/group/false/Localhost 0.14
296 TestNetworkPlugins/group/false/HairPin 0.12
297 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.14
298 TestNetworkPlugins/group/enable-default-cni/NetCatPod 11.18
299 TestNetworkPlugins/group/enable-default-cni/DNS 0.13
300 TestNetworkPlugins/group/enable-default-cni/Localhost 0.1
301 TestNetworkPlugins/group/enable-default-cni/HairPin 0.11
303 TestNetworkPlugins/group/bridge/Start 49.49
304 TestNetworkPlugins/group/kubenet/Start 49.15
305 TestNetworkPlugins/group/bridge/KubeletFlags 0.14
306 TestNetworkPlugins/group/bridge/NetCatPod 10.18
307 TestNetworkPlugins/group/bridge/DNS 0.13
308 TestNetworkPlugins/group/bridge/Localhost 0.1
309 TestNetworkPlugins/group/bridge/HairPin 0.11
310 TestNetworkPlugins/group/kubenet/KubeletFlags 0.15
311 TestNetworkPlugins/group/kubenet/NetCatPod 9.24
312 TestNetworkPlugins/group/kubenet/DNS 0.12
313 TestNetworkPlugins/group/kubenet/Localhost 0.1
314 TestNetworkPlugins/group/kubenet/HairPin 0.1
316 TestStartStop/group/old-k8s-version/serial/FirstStart 131.54
321 TestStartStop/group/old-k8s-version/serial/DeployApp 8.3
322 TestStartStop/group/no-preload/serial/Stop 8.27
323 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 0.76
324 TestStartStop/group/old-k8s-version/serial/Stop 8.22
325 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.29
326 TestStartStop/group/no-preload/serial/SecondStart 61.15
327 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.28
328 TestStartStop/group/old-k8s-version/serial/SecondStart 510.68
329 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 15.01
330 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 5.08
331 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.16
332 TestStartStop/group/no-preload/serial/Pause 1.79
334 TestStartStop/group/embed-certs/serial/FirstStart 49.12
335 TestStartStop/group/embed-certs/serial/DeployApp 8.27
336 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 0.85
337 TestStartStop/group/embed-certs/serial/Stop 8.25
338 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.29
339 TestStartStop/group/embed-certs/serial/SecondStart 296.92
340 TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop 5.02
341 TestStartStop/group/embed-certs/serial/AddonExistsAfterStop 5.06
342 TestStartStop/group/embed-certs/serial/VerifyKubernetesImages 0.16
343 TestStartStop/group/embed-certs/serial/Pause 1.8
345 TestStartStop/group/default-k8s-diff-port/serial/FirstStart 88.18
346 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 5.02
347 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 5.06
348 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.16
349 TestStartStop/group/old-k8s-version/serial/Pause 1.65
351 TestStartStop/group/newest-cni/serial/FirstStart 48.26
352 TestStartStop/group/default-k8s-diff-port/serial/DeployApp 8.27
353 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive 0.83
354 TestStartStop/group/default-k8s-diff-port/serial/Stop 8.25
355 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop 0.31
356 TestStartStop/group/default-k8s-diff-port/serial/SecondStart 296.64
357 TestStartStop/group/newest-cni/serial/DeployApp 0
358 TestStartStop/group/newest-cni/serial/EnableAddonWhileActive 0.91
359 TestStartStop/group/newest-cni/serial/Stop 8.25
360 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.29
361 TestStartStop/group/newest-cni/serial/SecondStart 39.4
362 TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop 0
363 TestStartStop/group/newest-cni/serial/AddonExistsAfterStop 0
364 TestStartStop/group/newest-cni/serial/VerifyKubernetesImages 0.17
365 TestStartStop/group/newest-cni/serial/Pause 1.79
366 TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop 5.01
367 TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop 5.06
368 TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages 0.16
369 TestStartStop/group/default-k8s-diff-port/serial/Pause 1.8
x
+
TestDownloadOnly/v1.16.0/json-events (12.04s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.16.0/json-events
aaa_download_only_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-113000 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-113000 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit : (12.037776225s)
--- PASS: TestDownloadOnly/v1.16.0/json-events (12.04s)

                                                
                                    
x
+
TestDownloadOnly/v1.16.0/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.16.0/preload-exists
--- PASS: TestDownloadOnly/v1.16.0/preload-exists (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.16.0/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.16.0/kubectl
--- PASS: TestDownloadOnly/v1.16.0/kubectl (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.16.0/LogsDuration (0.28s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.16.0/LogsDuration
aaa_download_only_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-113000
aaa_download_only_test.go:169: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-113000: exit status 85 (275.569434ms)

                                                
                                                
-- stdout --
	* 
	* ==> Audit <==
	* |---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-113000 | jenkins | v1.31.2 | 06 Sep 23 16:36 PDT |          |
	|         | -p download-only-113000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2023/09/06 16:36:55
	Running on machine: MacOS-Agent-1
	Binary: Built with gc go1.20.7 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0906 16:36:55.084916    1440 out.go:296] Setting OutFile to fd 1 ...
	I0906 16:36:55.085119    1440 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:36:55.085126    1440 out.go:309] Setting ErrFile to fd 2...
	I0906 16:36:55.085129    1440 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:36:55.085303    1440 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	W0906 16:36:55.085395    1440 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/17174-977/.minikube/config/config.json: open /Users/jenkins/minikube-integration/17174-977/.minikube/config/config.json: no such file or directory
	I0906 16:36:55.086981    1440 out.go:303] Setting JSON to true
	I0906 16:36:55.107982    1440 start.go:128] hostinfo: {"hostname":"MacOS-Agent-1.local","uptime":388,"bootTime":1694043027,"procs":393,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.5.1","kernelVersion":"22.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0906 16:36:55.108075    1440 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0906 16:36:55.130381    1440 out.go:97] [download-only-113000] minikube v1.31.2 on Darwin 13.5.1
	I0906 16:36:55.130643    1440 notify.go:220] Checking for updates...
	W0906 16:36:55.130677    1440 preload.go:295] Failed to list preload files: open /Users/jenkins/minikube-integration/17174-977/.minikube/cache/preloaded-tarball: no such file or directory
	I0906 16:36:55.152296    1440 out.go:169] MINIKUBE_LOCATION=17174
	I0906 16:36:55.174574    1440 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	I0906 16:36:55.196279    1440 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 16:36:55.217467    1440 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 16:36:55.239541    1440 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	W0906 16:36:55.283506    1440 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0906 16:36:55.283967    1440 driver.go:373] Setting default libvirt URI to qemu:///system
	I0906 16:36:55.377361    1440 out.go:97] Using the hyperkit driver based on user configuration
	I0906 16:36:55.377396    1440 start.go:298] selected driver: hyperkit
	I0906 16:36:55.377409    1440 start.go:902] validating driver "hyperkit" against <nil>
	I0906 16:36:55.377624    1440 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 16:36:55.377972    1440 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17174-977/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 16:36:55.520719    1440 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.31.2
	I0906 16:36:55.525077    1440 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:36:55.525098    1440 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0906 16:36:55.525124    1440 start_flags.go:307] no existing cluster config was found, will generate one from the flags 
	I0906 16:36:55.530145    1440 start_flags.go:384] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0906 16:36:55.530299    1440 start_flags.go:904] Wait components to verify : map[apiserver:true system_pods:true]
	I0906 16:36:55.530328    1440 cni.go:84] Creating CNI manager for ""
	I0906 16:36:55.530342    1440 cni.go:162] CNI unnecessary in this configuration, recommending no CNI
	I0906 16:36:55.530350    1440 start_flags.go:321] config:
	{Name:download-only-113000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-113000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s}
	I0906 16:36:55.530596    1440 iso.go:125] acquiring lock: {Name:mk785f5a651fb55e13065a70647b69ec2c0160e8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 16:36:55.552168    1440 out.go:97] Downloading VM boot image ...
	I0906 16:36:55.552258    1440 download.go:107] Downloading: https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/iso/amd64/minikube-v1.31.0-1692872107-17120-amd64.iso
	I0906 16:36:59.734317    1440 out.go:97] Starting control plane node download-only-113000 in cluster download-only-113000
	I0906 16:36:59.734364    1440 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0906 16:36:59.793646    1440 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I0906 16:36:59.793677    1440 cache.go:57] Caching tarball of preloaded images
	I0906 16:36:59.794002    1440 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0906 16:36:59.814865    1440 out.go:97] Downloading Kubernetes v1.16.0 preload ...
	I0906 16:36:59.814975    1440 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I0906 16:36:59.898924    1440 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4?checksum=md5:326f3ce331abb64565b50b8c9e791244 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-113000"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:170: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.16.0/LogsDuration (0.28s)
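For reference, the exit status 85 logged above is the expected outcome of running `minikube logs` against a download-only profile that has no control plane. A minimal sketch, assuming that specific code is the one worth asserting on, of the run-and-check-exit-code pattern this output reflects:

```go
package main

import (
	"errors"
	"fmt"
	"os/exec"
)

func main() {
	// Same invocation the test makes, run from the workspace root.
	cmd := exec.Command("out/minikube-darwin-amd64", "logs", "-p", "download-only-113000")
	out, err := cmd.CombinedOutput()

	var ee *exec.ExitError
	switch {
	case err == nil:
		fmt.Println("unexpected success:\n" + string(out))
	case errors.As(err, &ee) && ee.ExitCode() == 85:
		fmt.Println("got the expected exit status 85: no control plane exists for a download-only profile")
	default:
		fmt.Println("unexpected failure:", err)
	}
}
```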

                                                
                                    
x
+
TestDownloadOnly/v1.28.1/json-events (6.65s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.1/json-events
aaa_download_only_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-113000 --force --alsologtostderr --kubernetes-version=v1.28.1 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-113000 --force --alsologtostderr --kubernetes-version=v1.28.1 --container-runtime=docker --driver=hyperkit : (6.646812557s)
--- PASS: TestDownloadOnly/v1.28.1/json-events (6.65s)

                                                
                                    
x
+
TestDownloadOnly/v1.28.1/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.1/preload-exists
--- PASS: TestDownloadOnly/v1.28.1/preload-exists (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.28.1/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.1/kubectl
--- PASS: TestDownloadOnly/v1.28.1/kubectl (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.28.1/LogsDuration (0.27s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.1/LogsDuration
aaa_download_only_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-113000
aaa_download_only_test.go:169: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-113000: exit status 85 (273.137419ms)

                                                
                                                
-- stdout --
	* 
	* ==> Audit <==
	* |---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-113000 | jenkins | v1.31.2 | 06 Sep 23 16:36 PDT |          |
	|         | -p download-only-113000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	| start   | -o=json --download-only        | download-only-113000 | jenkins | v1.31.2 | 06 Sep 23 16:37 PDT |          |
	|         | -p download-only-113000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.28.1   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2023/09/06 16:37:07
	Running on machine: MacOS-Agent-1
	Binary: Built with gc go1.20.7 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0906 16:37:07.402942    1457 out.go:296] Setting OutFile to fd 1 ...
	I0906 16:37:07.403120    1457 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:37:07.403127    1457 out.go:309] Setting ErrFile to fd 2...
	I0906 16:37:07.403131    1457 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:37:07.403306    1457 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	W0906 16:37:07.403396    1457 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/17174-977/.minikube/config/config.json: open /Users/jenkins/minikube-integration/17174-977/.minikube/config/config.json: no such file or directory
	I0906 16:37:07.404597    1457 out.go:303] Setting JSON to true
	I0906 16:37:07.423970    1457 start.go:128] hostinfo: {"hostname":"MacOS-Agent-1.local","uptime":400,"bootTime":1694043027,"procs":398,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.5.1","kernelVersion":"22.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0906 16:37:07.424043    1457 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0906 16:37:07.445064    1457 out.go:97] [download-only-113000] minikube v1.31.2 on Darwin 13.5.1
	I0906 16:37:07.445216    1457 notify.go:220] Checking for updates...
	I0906 16:37:07.465904    1457 out.go:169] MINIKUBE_LOCATION=17174
	I0906 16:37:07.487073    1457 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	I0906 16:37:07.507978    1457 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 16:37:07.528957    1457 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 16:37:07.550621    1457 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	W0906 16:37:07.593093    1457 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0906 16:37:07.593633    1457 config.go:182] Loaded profile config "download-only-113000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.16.0
	W0906 16:37:07.593701    1457 start.go:810] api.Load failed for download-only-113000: filestore "download-only-113000": Docker machine "download-only-113000" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I0906 16:37:07.593836    1457 driver.go:373] Setting default libvirt URI to qemu:///system
	W0906 16:37:07.593868    1457 start.go:810] api.Load failed for download-only-113000: filestore "download-only-113000": Docker machine "download-only-113000" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I0906 16:37:07.622212    1457 out.go:97] Using the hyperkit driver based on existing profile
	I0906 16:37:07.622277    1457 start.go:298] selected driver: hyperkit
	I0906 16:37:07.622288    1457 start.go:902] validating driver "hyperkit" against &{Name:download-only-113000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kuber
netesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-113000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirro
r: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s}
	I0906 16:37:07.622607    1457 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 16:37:07.622791    1457 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/17174-977/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 16:37:07.630957    1457 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.31.2
	I0906 16:37:07.634370    1457 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:37:07.634396    1457 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0906 16:37:07.636706    1457 cni.go:84] Creating CNI manager for ""
	I0906 16:37:07.636727    1457 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0906 16:37:07.636743    1457 start_flags.go:321] config:
	{Name:download-only-113000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.1 ClusterName:download-only-113000 Namespace:
default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketV
MnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s}
	I0906 16:37:07.636890    1457 iso.go:125] acquiring lock: {Name:mk785f5a651fb55e13065a70647b69ec2c0160e8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 16:37:07.658215    1457 out.go:97] Starting control plane node download-only-113000 in cluster download-only-113000
	I0906 16:37:07.658249    1457 preload.go:132] Checking if preload exists for k8s version v1.28.1 and runtime docker
	I0906 16:37:07.715291    1457 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.1/preloaded-images-k8s-v18-v1.28.1-docker-overlay2-amd64.tar.lz4
	I0906 16:37:07.715349    1457 cache.go:57] Caching tarball of preloaded images
	I0906 16:37:07.715720    1457 preload.go:132] Checking if preload exists for k8s version v1.28.1 and runtime docker
	I0906 16:37:07.737078    1457 out.go:97] Downloading Kubernetes v1.28.1 preload ...
	I0906 16:37:07.737158    1457 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.28.1-docker-overlay2-amd64.tar.lz4 ...
	I0906 16:37:07.851406    1457 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.28.1/preloaded-images-k8s-v18-v1.28.1-docker-overlay2-amd64.tar.lz4?checksum=md5:e86539672b8ce9a3040455131c2fbb87 -> /Users/jenkins/minikube-integration/17174-977/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.1-docker-overlay2-amd64.tar.lz4
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-113000"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:170: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.1/LogsDuration (0.27s)
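The two start invocations in the audit table above are download-only runs: they cache the ISO, preload tarball, and kubectl for a given Kubernetes version without ever creating a VM, which is also why "minikube logs" exits with status 85 in this run (there is no control plane node to collect logs from). A rough manual equivalent, using a placeholder profile name, might be:

    # cache images and binaries for one Kubernetes version without creating a VM
    minikube start -p download-only-demo --download-only --force \
        --kubernetes-version=v1.28.1 --container-runtime=docker --driver=hyperkit
    # with no node created, "minikube logs" has nothing to read (exit status 85 in this run)
    minikube logs -p download-only-demo
    minikube delete -p download-only-demo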

                                                
                                    
TestDownloadOnly/DeleteAll (0.37s)

=== RUN   TestDownloadOnly/DeleteAll
aaa_download_only_test.go:187: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/DeleteAll (0.37s)

                                                
                                    
TestDownloadOnly/DeleteAlwaysSucceeds (0.35s)

=== RUN   TestDownloadOnly/DeleteAlwaysSucceeds
aaa_download_only_test.go:199: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-113000
--- PASS: TestDownloadOnly/DeleteAlwaysSucceeds (0.35s)

                                                
                                    
TestBinaryMirror (0.97s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-308000 --alsologtostderr --binary-mirror http://127.0.0.1:49390 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-308000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-308000
--- PASS: TestBinaryMirror (0.97s)
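TestBinaryMirror points a download-only start at a local HTTP mirror for the kubectl/kubelet/kubeadm binaries (the test serves one on an ephemeral port, 49390 in this run). A minimal sketch with a placeholder mirror URL and profile name:

    # fetch the Kubernetes binaries from a local mirror instead of the default location
    minikube start -p binary-mirror-demo --download-only \
        --binary-mirror http://127.0.0.1:8080 --driver=hyperkit
    minikube delete -p binary-mirror-demo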

                                                
                                    
TestOffline (55.49s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-615000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 
aab_offline_test.go:55: (dbg) Done: out/minikube-darwin-amd64 start -p offline-docker-615000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : (50.217879613s)
helpers_test.go:175: Cleaning up "offline-docker-615000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-615000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-615000: (5.274612975s)
--- PASS: TestOffline (55.49s)

                                                
                                    
TestAddons/Setup (126.61s)

=== RUN   TestAddons/Setup
addons_test.go:88: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-720000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:88: (dbg) Done: out/minikube-darwin-amd64 start -p addons-720000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (2m6.608410365s)
--- PASS: TestAddons/Setup (126.61s)

                                                
                                    
TestAddons/parallel/Registry (15s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:306: registry stabilized in 11.690978ms
addons_test.go:308: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-5jjz6" [49ddf590-2a1d-4fd4-8321-16eb56b85181] Running
addons_test.go:308: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.009803048s
addons_test.go:311: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-proxy-5kppw" [6ae5a636-a280-40cc-aa11-9bfdae5954b8] Running
addons_test.go:311: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.008360017s
addons_test.go:316: (dbg) Run:  kubectl --context addons-720000 delete po -l run=registry-test --now
addons_test.go:321: (dbg) Run:  kubectl --context addons-720000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:321: (dbg) Done: kubectl --context addons-720000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (4.356617366s)
addons_test.go:335: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 ip
2023/09/06 16:39:37 [DEBUG] GET http://192.168.64.2:5000
addons_test.go:364: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (15.00s)
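The registry check above amounts to two probes: one from inside the cluster against the registry Service DNS name, and one from the host against port 5000 on the VM IP (the DEBUG GET line). A rough manual version against the same addons-720000 profile; the curl is only a stand-in for the HTTP GET the test performs:

    # probe the in-cluster registry service from a throwaway busybox pod
    kubectl --context addons-720000 run --rm registry-test --restart=Never \
        --image=gcr.io/k8s-minikube/busybox -it -- \
        sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
    # the test then fetches port 5000 on the VM IP from the host
    curl -sI http://$(minikube -p addons-720000 ip):5000
    # turn the addon off again
    minikube -p addons-720000 addons disable registry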

                                                
                                    
TestAddons/parallel/Ingress (23.51s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:183: (dbg) Run:  kubectl --context addons-720000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:208: (dbg) Run:  kubectl --context addons-720000 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:221: (dbg) Run:  kubectl --context addons-720000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:226: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [91f070d3-104d-49ee-8d9c-813e4f45d917] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [91f070d3-104d-49ee-8d9c-813e4f45d917] Running
addons_test.go:226: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 13.012146245s
addons_test.go:238: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:262: (dbg) Run:  kubectl --context addons-720000 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:267: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 ip
addons_test.go:273: (dbg) Run:  nslookup hello-john.test 192.168.64.2
addons_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:282: (dbg) Done: out/minikube-darwin-amd64 -p addons-720000 addons disable ingress-dns --alsologtostderr -v=1: (1.69056437s)
addons_test.go:287: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 addons disable ingress --alsologtostderr -v=1
addons_test.go:287: (dbg) Done: out/minikube-darwin-amd64 -p addons-720000 addons disable ingress --alsologtostderr -v=1: (7.503854312s)
--- PASS: TestAddons/parallel/Ingress (23.51s)
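Condensed, the ingress verification above is: deploy an nginx pod, Service, and Ingress from the test fixtures, curl the controller from inside the VM with the expected Host header, then resolve a test hostname through ingress-dns against the VM IP. The manifest paths below only exist inside a minikube source checkout:

    kubectl --context addons-720000 replace --force -f testdata/nginx-ingress-v1.yaml
    kubectl --context addons-720000 replace --force -f testdata/nginx-pod-svc.yaml
    # hit the ingress controller from inside the VM with the Host header the rule expects
    minikube -p addons-720000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
    # ingress-dns: the test hostname should resolve against the VM's IP
    nslookup hello-john.test $(minikube -p addons-720000 ip)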

                                                
                                    
TestAddons/parallel/InspektorGadget (10.49s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:814: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:344: "gadget-6fzr5" [8f4f5594-720b-46f2-99a5-184ddec069a4] Running
addons_test.go:814: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.008322572s
addons_test.go:817: (dbg) Run:  out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-720000
addons_test.go:817: (dbg) Done: out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-720000: (5.478108098s)
--- PASS: TestAddons/parallel/InspektorGadget (10.49s)

                                                
                                    
TestAddons/parallel/MetricsServer (5.47s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:383: metrics-server stabilized in 2.123661ms
addons_test.go:385: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:344: "metrics-server-7c66d45ddc-w6282" [c6faf617-405f-489a-b798-b98c6826e714] Running
addons_test.go:385: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.009861766s
addons_test.go:391: (dbg) Run:  kubectl --context addons-720000 top pods -n kube-system
addons_test.go:408: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.47s)

                                                
                                    
TestAddons/parallel/HelmTiller (10.36s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:432: tiller-deploy stabilized in 2.368501ms
addons_test.go:434: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:344: "tiller-deploy-7b677967b9-qh9bk" [991063fa-dcb0-4a05-a54d-68fe4ae8a05e] Running
addons_test.go:434: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.013018481s
addons_test.go:449: (dbg) Run:  kubectl --context addons-720000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:449: (dbg) Done: kubectl --context addons-720000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version: (4.948965224s)
addons_test.go:466: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (10.36s)

                                                
                                    
TestAddons/parallel/CSI (49.51s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:537: csi-hostpath-driver pods stabilized in 6.921769ms
addons_test.go:540: (dbg) Run:  kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:545: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:550: (dbg) Run:  kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:555: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:344: "task-pv-pod" [808a9380-8d79-4076-90b0-f847c8cb1438] Pending
helpers_test.go:344: "task-pv-pod" [808a9380-8d79-4076-90b0-f847c8cb1438] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod" [808a9380-8d79-4076-90b0-f847c8cb1438] Running
addons_test.go:555: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 13.014069718s
addons_test.go:560: (dbg) Run:  kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:565: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:419: (dbg) Run:  kubectl --context addons-720000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:419: (dbg) Run:  kubectl --context addons-720000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:419: (dbg) Run:  kubectl --context addons-720000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:570: (dbg) Run:  kubectl --context addons-720000 delete pod task-pv-pod
addons_test.go:570: (dbg) Done: kubectl --context addons-720000 delete pod task-pv-pod: (1.11225769s)
addons_test.go:576: (dbg) Run:  kubectl --context addons-720000 delete pvc hpvc
addons_test.go:582: (dbg) Run:  kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:587: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-720000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:592: (dbg) Run:  kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:597: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:344: "task-pv-pod-restore" [f4c8c91d-2759-4bd0-b7e7-7f50af8f03e1] Pending
helpers_test.go:344: "task-pv-pod-restore" [f4c8c91d-2759-4bd0-b7e7-7f50af8f03e1] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod-restore" [f4c8c91d-2759-4bd0-b7e7-7f50af8f03e1] Running
addons_test.go:597: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 11.025238523s
addons_test.go:602: (dbg) Run:  kubectl --context addons-720000 delete pod task-pv-pod-restore
addons_test.go:606: (dbg) Run:  kubectl --context addons-720000 delete pvc hpvc-restore
addons_test.go:610: (dbg) Run:  kubectl --context addons-720000 delete volumesnapshot new-snapshot-demo
addons_test.go:614: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:614: (dbg) Done: out/minikube-darwin-amd64 -p addons-720000 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.362633645s)
addons_test.go:618: (dbg) Run:  out/minikube-darwin-amd64 -p addons-720000 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (49.51s)
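The CSI sequence above is a full create, snapshot, restore round trip using the csi-hostpath driver fixtures. Stripped of the polling, it looks roughly like this (paths are relative to the integration test directory):

    kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/pvc.yaml
    kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/pv-pod.yaml
    # snapshot the bound volume, then delete the original pod and claim
    kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/snapshot.yaml
    kubectl --context addons-720000 delete pod task-pv-pod
    kubectl --context addons-720000 delete pvc hpvc
    # restore the snapshot into a new claim and mount it from a new pod
    kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
    kubectl --context addons-720000 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml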

                                                
                                    
TestAddons/parallel/Headlamp (13.24s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:800: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-720000 --alsologtostderr -v=1
addons_test.go:800: (dbg) Done: out/minikube-darwin-amd64 addons enable headlamp -p addons-720000 --alsologtostderr -v=1: (1.225168103s)
addons_test.go:805: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:344: "headlamp-699c48fb74-gh47f" [593ca5f2-3846-4d64-9dcc-c6aa895e813a] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:344: "headlamp-699c48fb74-gh47f" [593ca5f2-3846-4d64-9dcc-c6aa895e813a] Running
addons_test.go:805: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 12.01005938s
--- PASS: TestAddons/parallel/Headlamp (13.24s)

                                                
                                    
TestAddons/parallel/CloudSpanner (5.4s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:833: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:344: "cloud-spanner-emulator-6dcc56475c-hrrhl" [d6e05ae9-5973-4c35-b70b-db26f9741421] Running
addons_test.go:833: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.008248124s
addons_test.go:836: (dbg) Run:  out/minikube-darwin-amd64 addons disable cloud-spanner -p addons-720000
--- PASS: TestAddons/parallel/CloudSpanner (5.40s)

                                                
                                    
TestAddons/serial/GCPAuth/Namespaces (0.09s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:626: (dbg) Run:  kubectl --context addons-720000 create ns new-namespace
addons_test.go:640: (dbg) Run:  kubectl --context addons-720000 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.09s)

                                                
                                    
TestAddons/StoppedEnableDisable (5.7s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:148: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-720000
addons_test.go:148: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-720000: (5.222917825s)
addons_test.go:152: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-720000
addons_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-720000
addons_test.go:161: (dbg) Run:  out/minikube-darwin-amd64 addons disable gvisor -p addons-720000
--- PASS: TestAddons/StoppedEnableDisable (5.70s)

                                                
                                    
TestCertOptions (41.08s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-594000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 
E0906 17:07:15.057002    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
cert_options_test.go:49: (dbg) Done: out/minikube-darwin-amd64 start -p cert-options-594000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : (35.472163163s)
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-594000 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-594000 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-594000 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-594000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-594000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-594000: (5.289943291s)
--- PASS: TestCertOptions (41.08s)
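To reproduce the certificate-options check by hand, the same flags can be passed to a throwaway profile and the resulting apiserver certificate inspected from inside the VM; the profile name below is just an example:

    minikube start -p cert-options-demo --memory=2048 \
        --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 \
        --apiserver-names=localhost --apiserver-names=www.google.com \
        --apiserver-port=8555 --driver=hyperkit
    # the extra IPs and names should appear in the apiserver certificate SANs
    minikube -p cert-options-demo ssh \
        "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
    # the in-VM kubeconfig should reference the non-default apiserver port 8555
    minikube ssh -p cert-options-demo -- "sudo cat /etc/kubernetes/admin.conf"
    minikube delete -p cert-options-demo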

                                                
                                    
TestCertExpiration (242.38s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-063000 --memory=2048 --cert-expiration=3m --driver=hyperkit 
cert_options_test.go:123: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-063000 --memory=2048 --cert-expiration=3m --driver=hyperkit : (34.837056059s)
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-063000 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
cert_options_test.go:131: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-063000 --memory=2048 --cert-expiration=8760h --driver=hyperkit : (22.296685118s)
helpers_test.go:175: Cleaning up "cert-expiration-063000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-063000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-063000: (5.241443259s)
--- PASS: TestCertExpiration (242.38s)
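Most of the wall-clock time here is evidently spent waiting out the three-minute certificate expiry the first start configures, before a second start with a normal expiry exercises regeneration of the lapsed certificates. A minimal sketch with an example profile name:

    # issue certificates that expire after 3 minutes
    minikube start -p cert-expiration-demo --memory=2048 --cert-expiration=3m --driver=hyperkit
    # ...wait out the 3 minute window, then restart with a one-year expiry
    minikube start -p cert-expiration-demo --memory=2048 --cert-expiration=8760h --driver=hyperkit
    minikube delete -p cert-expiration-demo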

                                                
                                    
TestDockerFlags (39.57s)

=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags

=== CONT  TestDockerFlags
docker_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-723000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:51: (dbg) Done: out/minikube-darwin-amd64 start -p docker-flags-723000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : (35.790842809s)
docker_test.go:56: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-723000 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:67: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-723000 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
helpers_test.go:175: Cleaning up "docker-flags-723000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-723000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-723000: (3.479872576s)
--- PASS: TestDockerFlags (39.57s)
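The same check can be run by hand: start a throwaway profile with --docker-env and --docker-opt values, then read them back out of the docker systemd unit exactly as the test does. The profile name below is an example:

    minikube start -p docker-flags-demo --memory=2048 --wait=false \
        --docker-env=FOO=BAR --docker-env=BAZ=BAT \
        --docker-opt=debug --docker-opt=icc=true --driver=hyperkit
    # FOO/BAZ should show up in Environment=, the opts in ExecStart
    minikube -p docker-flags-demo ssh "sudo systemctl show docker --property=Environment --no-pager"
    minikube -p docker-flags-demo ssh "sudo systemctl show docker --property=ExecStart --no-pager"
    minikube delete -p docker-flags-demo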

                                                
                                    
TestForceSystemdFlag (40.19s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-322000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:91: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-flag-322000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : (34.723254988s)
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-322000 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-flag-322000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-322000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-322000: (5.297524731s)
--- PASS: TestForceSystemdFlag (40.19s)
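A rough manual equivalent of the cgroup-driver check, with an example profile name:

    minikube start -p force-systemd-demo --memory=2048 --force-systemd --driver=hyperkit
    # with --force-systemd, docker should report "systemd" as its cgroup driver
    minikube -p force-systemd-demo ssh "docker info --format {{.CgroupDriver}}"
    minikube delete -p force-systemd-demo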

                                                
                                    
TestHyperKitDriverInstallOrUpdate (6.63s)

=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate

=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (6.63s)

                                                
                                    
TestErrorSpam/setup (34.28s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-868000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 --driver=hyperkit 
error_spam_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-868000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 --driver=hyperkit : (34.280907371s)
--- PASS: TestErrorSpam/setup (34.28s)
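The ErrorSpam subtests that follow drive this nospam profile through start/status/pause/unpause/stop while pointing --log_dir at a scratch directory; as the name suggests, the point of the group is to catch unexpected warning spam in the output. The general pattern, with placeholder names and paths, looks like:

    LOGDIR=$(mktemp -d)    # scratch directory for minikube's log files
    minikube start -p nospam-demo -n=1 --memory=2250 --wait=false --log_dir="$LOGDIR" --driver=hyperkit
    minikube -p nospam-demo --log_dir "$LOGDIR" status
    minikube -p nospam-demo --log_dir "$LOGDIR" pause
    minikube -p nospam-demo --log_dir "$LOGDIR" unpause
    minikube -p nospam-demo --log_dir "$LOGDIR" stop
    minikube delete -p nospam-demo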

                                                
                                    
TestErrorSpam/start (1.53s)

=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 start --dry-run
--- PASS: TestErrorSpam/start (1.53s)

                                                
                                    
TestErrorSpam/status (0.43s)

=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 status
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 status
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 status
--- PASS: TestErrorSpam/status (0.43s)

                                                
                                    
TestErrorSpam/pause (1.21s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 pause
--- PASS: TestErrorSpam/pause (1.21s)

                                                
                                    
TestErrorSpam/unpause (1.22s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 unpause
--- PASS: TestErrorSpam/unpause (1.22s)

                                                
                                    
TestErrorSpam/stop (3.61s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 stop: (3.199692839s)
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 stop
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-868000 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-868000 stop
--- PASS: TestErrorSpam/stop (3.61s)

                                                
                                    
TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1851: local sync path: /Users/jenkins/minikube-integration/17174-977/.minikube/files/etc/test/nested/copy/1438/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

                                                
                                    
TestFunctional/serial/StartWithProxy (51.66s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2230: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-283000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
functional_test.go:2230: (dbg) Done: out/minikube-darwin-amd64 start -p functional-283000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (51.660226807s)
--- PASS: TestFunctional/serial/StartWithProxy (51.66s)

                                                
                                    
TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

                                                
                                    
TestFunctional/serial/SoftStart (40.83s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:655: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-283000 --alsologtostderr -v=8
functional_test.go:655: (dbg) Done: out/minikube-darwin-amd64 start -p functional-283000 --alsologtostderr -v=8: (40.832539765s)
functional_test.go:659: soft start took 40.833010774s for "functional-283000" cluster.
--- PASS: TestFunctional/serial/SoftStart (40.83s)

                                                
                                    
TestFunctional/serial/KubeContext (0.03s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:677: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.03s)

                                                
                                    
TestFunctional/serial/KubectlGetPods (0.06s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:692: (dbg) Run:  kubectl --context functional-283000 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.06s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/add_remote (4.42s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 cache add registry.k8s.io/pause:3.1
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 cache add registry.k8s.io/pause:3.1: (1.498136467s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 cache add registry.k8s.io/pause:3.3
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 cache add registry.k8s.io/pause:3.3: (1.506898507s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 cache add registry.k8s.io/pause:latest
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 cache add registry.k8s.io/pause:latest: (1.414687555s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (4.42s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/add_local (1.4s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1073: (dbg) Run:  docker build -t minikube-local-cache-test:functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalserialCacheCmdcacheadd_local2228911085/001
functional_test.go:1085: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 cache add minikube-local-cache-test:functional-283000
functional_test.go:1090: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 cache delete minikube-local-cache-test:functional-283000
functional_test.go:1079: (dbg) Run:  docker rmi minikube-local-cache-test:functional-283000
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.40s)
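add_local builds a small throwaway image locally and pushes it through the cache path. A sketch of the same flow, assuming a Dockerfile in the current directory and using an example tag:

    # build a local image, add it to minikube's cache, and load it into the node
    docker build -t minikube-local-cache-test:demo .
    minikube -p functional-283000 cache add minikube-local-cache-test:demo
    # remove it from the cache and from the local docker daemon afterwards
    minikube -p functional-283000 cache delete minikube-local-cache-test:demo
    docker rmi minikube-local-cache-test:demo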

                                                
                                    
TestFunctional/serial/CacheCmd/cache/CacheDelete (0.06s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1098: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.06s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/list (0.06s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1106: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.06s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.16s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1120: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.16s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/cache_reload (1.34s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1143: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh sudo docker rmi registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (138.869531ms)

                                                
                                                
-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test.go:1154: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 cache reload
functional_test.go:1159: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.34s)
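The cache_reload steps above form a simple round trip: cache an image, delete it from the node's runtime, confirm it is gone, then reload the cache and confirm it is back. Replayed by hand against the same profile:

    minikube -p functional-283000 cache add registry.k8s.io/pause:latest
    # remove the image from the node only; the cached copy on the host remains
    minikube -p functional-283000 ssh sudo docker rmi registry.k8s.io/pause:latest
    # expected to fail now, as it does above, because the image is gone from the node
    minikube -p functional-283000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
    # push everything in the cache back onto the node and re-check
    minikube -p functional-283000 cache reload
    minikube -p functional-283000 ssh sudo crictl inspecti registry.k8s.io/pause:latest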

                                                
                                    
TestFunctional/serial/CacheCmd/cache/delete (0.13s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.13s)

                                                
                                    
TestFunctional/serial/MinikubeKubectlCmd (0.53s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:712: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 kubectl -- --context functional-283000 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.53s)

                                                
                                    
TestFunctional/serial/MinikubeKubectlCmdDirectly (0.73s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:737: (dbg) Run:  out/kubectl --context functional-283000 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.73s)

                                                
                                    
TestFunctional/serial/ExtraConfig (40.42s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:753: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-283000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
functional_test.go:753: (dbg) Done: out/minikube-darwin-amd64 start -p functional-283000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (40.423705468s)
functional_test.go:757: restart took 40.423865748s for "functional-283000" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (40.42s)
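The restart above shows the general shape of --extra-config, component.key=value settings applied when (re)starting an existing profile, for example:

    # restart the existing profile with an extra apiserver admission plugin enabled
    minikube start -p functional-283000 \
        --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all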

                                                
                                    
TestFunctional/serial/ComponentHealth (0.05s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:806: (dbg) Run:  kubectl --context functional-283000 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:821: etcd phase: Running
functional_test.go:831: etcd status: Ready
functional_test.go:821: kube-apiserver phase: Running
functional_test.go:831: kube-apiserver status: Ready
functional_test.go:821: kube-controller-manager phase: Running
functional_test.go:831: kube-controller-manager status: Ready
functional_test.go:821: kube-scheduler phase: Running
functional_test.go:831: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.05s)

                                                
                                    
TestFunctional/serial/LogsCmd (2.73s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1232: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 logs
functional_test.go:1232: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 logs: (2.726941022s)
--- PASS: TestFunctional/serial/LogsCmd (2.73s)

                                                
                                    
TestFunctional/serial/LogsFileCmd (2.59s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1246: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 logs --file /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalserialLogsFileCmd131965357/001/logs.txt
functional_test.go:1246: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 logs --file /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalserialLogsFileCmd131965357/001/logs.txt: (2.588462828s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.59s)

                                                
                                    
TestFunctional/serial/InvalidService (4.35s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2317: (dbg) Run:  kubectl --context functional-283000 apply -f testdata/invalidsvc.yaml
functional_test.go:2331: (dbg) Run:  out/minikube-darwin-amd64 service invalid-svc -p functional-283000
functional_test.go:2331: (dbg) Non-zero exit: out/minikube-darwin-amd64 service invalid-svc -p functional-283000: exit status 115 (253.463331ms)

-- stdout --
	|-----------|-------------|-------------|---------------------------|
	| NAMESPACE |    NAME     | TARGET PORT |            URL            |
	|-----------|-------------|-------------|---------------------------|
	| default   | invalid-svc |          80 | http://192.168.64.4:31807 |
	|-----------|-------------|-------------|---------------------------|
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                            │
	│    * If the above advice does not help, please let us know:                                                                │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                              │
	│                                                                                                                            │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                   │
	│    * Please also attach the following file to the GitHub issue:                                                            │
	│    * - /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log    │
	│                                                                                                                            │
	╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2323: (dbg) Run:  kubectl --context functional-283000 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (4.35s)

TestFunctional/parallel/ConfigCmd (0.56s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 config get cpus: exit status 14 (40.190429ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 config set cpus 2
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 config get cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 config get cpus: exit status 14 (70.566732ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.56s)
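
[Editor's note] The ConfigCmd steps above hinge on `config get cpus` exiting with status 14 once the key has been unset. A minimal Go sketch of the same check follows; this is not how functional_test.go itself runs commands, and the relative binary path and profile name are taken from this run only.

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// After `config unset cpus`, the key is absent and minikube exits with status 14.
	cmd := exec.Command("out/minikube-darwin-amd64", "-p", "functional-283000", "config", "get", "cpus")
	out, err := cmd.CombinedOutput()
	if exitErr, ok := err.(*exec.ExitError); ok {
		fmt.Printf("exit status %d: %s", exitErr.ExitCode(), out)
		return
	}
	if err != nil {
		fmt.Println("could not run minikube:", err)
		return
	}
	fmt.Printf("cpus is set to: %s", out)
}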

TestFunctional/parallel/DashboardCmd (13.78s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:901: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-283000 --alsologtostderr -v=1]
functional_test.go:906: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-283000 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to kill pid 2516: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (13.78s)

TestFunctional/parallel/DryRun (1.03s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:970: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-283000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:970: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-283000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (531.302511ms)

-- stdout --
	* [functional-283000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I0906 16:45:07.009118    2486 out.go:296] Setting OutFile to fd 1 ...
	I0906 16:45:07.009284    2486 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:45:07.009290    2486 out.go:309] Setting ErrFile to fd 2...
	I0906 16:45:07.009294    2486 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:45:07.009466    2486 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	I0906 16:45:07.010741    2486 out.go:303] Setting JSON to false
	I0906 16:45:07.030374    2486 start.go:128] hostinfo: {"hostname":"MacOS-Agent-1.local","uptime":880,"bootTime":1694043027,"procs":430,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.5.1","kernelVersion":"22.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0906 16:45:07.030443    2486 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0906 16:45:07.054712    2486 out.go:177] * [functional-283000] minikube v1.31.2 on Darwin 13.5.1
	I0906 16:45:07.095268    2486 notify.go:220] Checking for updates...
	I0906 16:45:07.137327    2486 out.go:177]   - MINIKUBE_LOCATION=17174
	I0906 16:45:07.200233    2486 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	I0906 16:45:07.221211    2486 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 16:45:07.242141    2486 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 16:45:07.263255    2486 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	I0906 16:45:07.284323    2486 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0906 16:45:07.305605    2486 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 16:45:07.306066    2486 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:45:07.306117    2486 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:45:07.313332    2486 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50419
	I0906 16:45:07.313688    2486 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:45:07.314116    2486 main.go:141] libmachine: Using API Version  1
	I0906 16:45:07.314127    2486 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:45:07.314330    2486 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:45:07.314436    2486 main.go:141] libmachine: (functional-283000) Calling .DriverName
	I0906 16:45:07.314591    2486 driver.go:373] Setting default libvirt URI to qemu:///system
	I0906 16:45:07.314828    2486 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:45:07.314854    2486 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:45:07.321568    2486 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50421
	I0906 16:45:07.321896    2486 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:45:07.322220    2486 main.go:141] libmachine: Using API Version  1
	I0906 16:45:07.322230    2486 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:45:07.322461    2486 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:45:07.322566    2486 main.go:141] libmachine: (functional-283000) Calling .DriverName
	I0906 16:45:07.350235    2486 out.go:177] * Using the hyperkit driver based on existing profile
	I0906 16:45:07.392220    2486 start.go:298] selected driver: hyperkit
	I0906 16:45:07.392231    2486 start.go:902] validating driver "hyperkit" against &{Name:functional-283000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.28.1 ClusterName:functional-283000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.4 Port:8441 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDis
ks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s}
	I0906 16:45:07.392333    2486 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 16:45:07.416294    2486 out.go:177] 
	W0906 16:45:07.437321    2486 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0906 16:45:07.458095    2486 out.go:177] 

** /stderr **
functional_test.go:987: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-283000 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (1.03s)

TestFunctional/parallel/InternationalLanguage (0.45s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1016: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-283000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1016: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-283000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (448.6107ms)

-- stdout --
	* [functional-283000] minikube v1.31.2 sur Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote hyperkit basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I0906 16:45:03.296870    2438 out.go:296] Setting OutFile to fd 1 ...
	I0906 16:45:03.297027    2438 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:45:03.297033    2438 out.go:309] Setting ErrFile to fd 2...
	I0906 16:45:03.297037    2438 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:45:03.297233    2438 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	I0906 16:45:03.298721    2438 out.go:303] Setting JSON to false
	I0906 16:45:03.318371    2438 start.go:128] hostinfo: {"hostname":"MacOS-Agent-1.local","uptime":876,"bootTime":1694043027,"procs":404,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"13.5.1","kernelVersion":"22.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0906 16:45:03.318706    2438 start.go:136] gopshost.Virtualization returned error: not implemented yet
	I0906 16:45:03.340510    2438 out.go:177] * [functional-283000] minikube v1.31.2 sur Darwin 13.5.1
	I0906 16:45:03.388883    2438 out.go:177]   - MINIKUBE_LOCATION=17174
	I0906 16:45:03.388942    2438 notify.go:220] Checking for updates...
	I0906 16:45:03.431036    2438 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	I0906 16:45:03.452140    2438 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 16:45:03.473078    2438 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 16:45:03.493796    2438 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	I0906 16:45:03.515022    2438 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0906 16:45:03.536726    2438 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 16:45:03.537425    2438 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:45:03.537501    2438 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:45:03.545109    2438 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50350
	I0906 16:45:03.545445    2438 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:45:03.545888    2438 main.go:141] libmachine: Using API Version  1
	I0906 16:45:03.545907    2438 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:45:03.546157    2438 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:45:03.546278    2438 main.go:141] libmachine: (functional-283000) Calling .DriverName
	I0906 16:45:03.546440    2438 driver.go:373] Setting default libvirt URI to qemu:///system
	I0906 16:45:03.546688    2438 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:45:03.546713    2438 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:45:03.553387    2438 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50352
	I0906 16:45:03.553709    2438 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:45:03.554078    2438 main.go:141] libmachine: Using API Version  1
	I0906 16:45:03.554100    2438 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:45:03.554299    2438 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:45:03.554400    2438 main.go:141] libmachine: (functional-283000) Calling .DriverName
	I0906 16:45:03.581131    2438 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I0906 16:45:03.623179    2438 start.go:298] selected driver: hyperkit
	I0906 16:45:03.623204    2438 start.go:902] validating driver "hyperkit" against &{Name:functional-283000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/17120/minikube-v1.31.0-1692872107-17120-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.40-1693938323-17174@sha256:4edc55cb1933a7155ece55408f8b4aebfd99e28fa2209bc82b369d8ca3bf525b Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 Kubernet
esConfig:{KubernetesVersion:v1.28.1 ClusterName:functional-283000 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.4 Port:8441 KubernetesVersion:v1.28.1 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDis
ks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 AutoPauseInterval:1m0s}
	I0906 16:45:03.623406    2438 start.go:913] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 16:45:03.647895    2438 out.go:177] 
	W0906 16:45:03.669029    2438 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0906 16:45:03.690135    2438 out.go:177] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.45s)

TestFunctional/parallel/StatusCmd (0.53s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:850: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 status
functional_test.go:856: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:868: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.53s)

TestFunctional/parallel/ServiceCmdConnect (11.37s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1628: (dbg) Run:  kubectl --context functional-283000 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
functional_test.go:1634: (dbg) Run:  kubectl --context functional-283000 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1639: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:344: "hello-node-connect-55497b8b78-cwq59" [6f77ab05-2893-4dbe-ad70-19fce4eee7ed] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
E0906 16:44:43.415543    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
helpers_test.go:344: "hello-node-connect-55497b8b78-cwq59" [6f77ab05-2893-4dbe-ad70-19fce4eee7ed] Running
functional_test.go:1639: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 11.010853395s
functional_test.go:1648: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 service hello-node-connect --url
functional_test.go:1654: found endpoint for hello-node-connect: http://192.168.64.4:30354
functional_test.go:1674: http://192.168.64.4:30354: success! body:

Hostname: hello-node-connect-55497b8b78-cwq59

Pod Information:
	-no pod information available-

Server values:
	server_version=nginx: 1.13.3 - lua: 10008

Request Information:
	client_address=10.244.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.168.64.4:8080/

Request Headers:
	accept-encoding=gzip
	host=192.168.64.4:30354
	user-agent=Go-http-client/1.1

Request Body:
	-no body in request-

--- PASS: TestFunctional/parallel/ServiceCmdConnect (11.37s)
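
[Editor's note] The test resolves the NodePort URL with `service hello-node-connect --url` and then fetches it; the echoed hostname and request headers above are the body of that GET. A small Go sketch of the same probe, assuming the URL printed in this particular run (it changes per cluster and per run):

package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// URL as reported by `minikube service hello-node-connect --url` in this run; treat it as a placeholder.
	const url = "http://192.168.64.4:30354"

	resp, err := http.Get(url)
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		fmt.Println("read failed:", err)
		return
	}
	// The echoserver replies with the pod hostname and the request details, as captured above.
	fmt.Println(string(body))
}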

TestFunctional/parallel/AddonsCmd (0.23s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1689: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 addons list
functional_test.go:1701: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.23s)

TestFunctional/parallel/PersistentVolumeClaim (26.27s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:344: "storage-provisioner" [8f3726b5-43c2-4f6e-acfd-4587dd470945] Running
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.009952457s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-283000 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-283000 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-283000 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-283000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [5f0f5c86-3b7a-4605-80cb-0259318b0bd6] Pending
helpers_test.go:344: "sp-pod" [5f0f5c86-3b7a-4605-80cb-0259318b0bd6] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [5f0f5c86-3b7a-4605-80cb-0259318b0bd6] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 12.010049947s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-283000 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-283000 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-283000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [d4dfc756-5ba0-4e01-aa8a-4389ac6375b6] Pending
helpers_test.go:344: "sp-pod" [d4dfc756-5ba0-4e01-aa8a-4389ac6375b6] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [d4dfc756-5ba0-4e01-aa8a-4389ac6375b6] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.009084762s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-283000 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (26.27s)

TestFunctional/parallel/SSHCmd (0.37s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1724: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "echo hello"
functional_test.go:1741: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.37s)

TestFunctional/parallel/CpCmd (0.57s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh -n functional-283000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 cp functional-283000:/home/docker/cp-test.txt /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelCpCmd3127513460/001/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh -n functional-283000 "sudo cat /home/docker/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (0.57s)

TestFunctional/parallel/MySQL (28.36s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL

=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1789: (dbg) Run:  kubectl --context functional-283000 replace --force -f testdata/mysql.yaml
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:344: "mysql-859648c796-ln7sb" [ce0a9027-5aa6-47f1-9804-212b5fa2676f] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
helpers_test.go:344: "mysql-859648c796-ln7sb" [ce0a9027-5aa6-47f1-9804-212b5fa2676f] Running
E0906 16:44:33.175033    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 25.023566169s
functional_test.go:1803: (dbg) Run:  kubectl --context functional-283000 exec mysql-859648c796-ln7sb -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-283000 exec mysql-859648c796-ln7sb -- mysql -ppassword -e "show databases;": exit status 1 (143.062875ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-283000 exec mysql-859648c796-ln7sb -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-283000 exec mysql-859648c796-ln7sb -- mysql -ppassword -e "show databases;": exit status 1 (109.256494ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-283000 exec mysql-859648c796-ln7sb -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (28.36s)
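
[Editor's note] The two "Can't connect to local MySQL server through socket" failures above are expected while mysqld is still initializing inside the pod; the test simply re-runs the `kubectl exec ... mysql` command until it succeeds. A rough retry loop in the same spirit (pod name and interval are placeholders from this run, not part of the test harness):

package main

import (
	"fmt"
	"os/exec"
	"time"
)

func main() {
	// Pod name is from this run; look it up with `kubectl get pods -l app=mysql`.
	args := []string{"--context", "functional-283000", "exec", "mysql-859648c796-ln7sb", "--",
		"mysql", "-ppassword", "-e", "show databases;"}

	for attempt := 1; attempt <= 10; attempt++ {
		out, err := exec.Command("kubectl", args...).CombinedOutput()
		if err == nil {
			fmt.Print(string(out))
			return
		}
		// mysqld has not opened its socket yet; wait and try again.
		fmt.Printf("attempt %d failed: %v\n", attempt, err)
		time.Sleep(5 * time.Second)
	}
	fmt.Println("mysql never became ready")
}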

TestFunctional/parallel/FileSync (0.16s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1925: Checking for existence of /etc/test/nested/copy/1438/hosts within VM
functional_test.go:1927: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo cat /etc/test/nested/copy/1438/hosts"
functional_test.go:1932: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.16s)

TestFunctional/parallel/CertSync (0.94s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1968: Checking for existence of /etc/ssl/certs/1438.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo cat /etc/ssl/certs/1438.pem"
functional_test.go:1968: Checking for existence of /usr/share/ca-certificates/1438.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo cat /usr/share/ca-certificates/1438.pem"
functional_test.go:1968: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/14382.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo cat /etc/ssl/certs/14382.pem"
functional_test.go:1995: Checking for existence of /usr/share/ca-certificates/14382.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo cat /usr/share/ca-certificates/14382.pem"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (0.94s)

TestFunctional/parallel/NodeLabels (0.06s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:218: (dbg) Run:  kubectl --context functional-283000 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.06s)
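
[Editor's note] The go-template passed to kubectl above ranges over the labels map of the first node and prints each key. The same template string can be exercised locally with Go's text/template; the data structure and label keys below are stand-ins for the `kubectl get nodes -o json` output, not the labels of this cluster.

package main

import (
	"os"
	"text/template"
)

func main() {
	// Same template string the test hands to kubectl via --output=go-template.
	const tpl = `{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}`

	// Only the fields the template touches are modeled here.
	data := map[string]any{
		"items": []map[string]any{
			{"metadata": map[string]any{
				"labels": map[string]string{
					"kubernetes.io/hostname":  "functional-283000",
					"minikube.k8s.io/primary": "true",
				},
			}},
		},
	}

	t := template.Must(template.New("labels").Parse(tpl))
	if err := t.Execute(os.Stdout, data); err != nil {
		panic(err)
	}
	os.Stdout.WriteString("\n")
}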

TestFunctional/parallel/NonActiveRuntimeDisabled (0.2s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2023: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo systemctl is-active crio"
functional_test.go:2023: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh "sudo systemctl is-active crio": exit status 1 (198.258893ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.20s)

TestFunctional/parallel/License (0.55s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License

=== CONT  TestFunctional/parallel/License
functional_test.go:2284: (dbg) Run:  out/minikube-darwin-amd64 license
--- PASS: TestFunctional/parallel/License (0.55s)

TestFunctional/parallel/Version/short (0.12s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2252: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 version --short
--- PASS: TestFunctional/parallel/Version/short (0.12s)

TestFunctional/parallel/Version/components (0.44s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2266: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.44s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.17s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls --format short --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-283000 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.9
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.28.1
registry.k8s.io/kube-proxy:v1.28.1
registry.k8s.io/kube-controller-manager:v1.28.1
registry.k8s.io/kube-apiserver:v1.28.1
registry.k8s.io/etcd:3.5.9-0
registry.k8s.io/echoserver:1.8
registry.k8s.io/coredns/coredns:v1.10.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-283000
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-283000
docker.io/kubernetesui/metrics-scraper:<none>
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-283000 image ls --format short --alsologtostderr:
I0906 16:45:14.765345    2614 out.go:296] Setting OutFile to fd 1 ...
I0906 16:45:14.765612    2614 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:14.765619    2614 out.go:309] Setting ErrFile to fd 2...
I0906 16:45:14.765623    2614 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:14.765812    2614 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
I0906 16:45:14.766582    2614 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:14.766721    2614 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:14.767196    2614 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:14.767246    2614 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:14.775381    2614 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50592
I0906 16:45:14.775923    2614 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:14.776479    2614 main.go:141] libmachine: Using API Version  1
I0906 16:45:14.776492    2614 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:14.776775    2614 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:14.776951    2614 main.go:141] libmachine: (functional-283000) Calling .GetState
I0906 16:45:14.777057    2614 main.go:141] libmachine: (functional-283000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0906 16:45:14.777120    2614 main.go:141] libmachine: (functional-283000) DBG | hyperkit pid from json: 1847
I0906 16:45:14.778424    2614 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:14.778465    2614 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:14.786409    2614 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50594
I0906 16:45:14.786778    2614 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:14.787116    2614 main.go:141] libmachine: Using API Version  1
I0906 16:45:14.787128    2614 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:14.787345    2614 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:14.787451    2614 main.go:141] libmachine: (functional-283000) Calling .DriverName
I0906 16:45:14.787598    2614 ssh_runner.go:195] Run: systemctl --version
I0906 16:45:14.787616    2614 main.go:141] libmachine: (functional-283000) Calling .GetSSHHostname
I0906 16:45:14.787703    2614 main.go:141] libmachine: (functional-283000) Calling .GetSSHPort
I0906 16:45:14.787808    2614 main.go:141] libmachine: (functional-283000) Calling .GetSSHKeyPath
I0906 16:45:14.787916    2614 main.go:141] libmachine: (functional-283000) Calling .GetSSHUsername
I0906 16:45:14.788008    2614 sshutil.go:53] new ssh client: &{IP:192.168.64.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/functional-283000/id_rsa Username:docker}
I0906 16:45:14.841202    2614 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0906 16:45:14.867510    2614 main.go:141] libmachine: Making call to close driver server
I0906 16:45:14.867518    2614 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:14.867669    2614 main.go:141] libmachine: (functional-283000) DBG | Closing plugin on server side
I0906 16:45:14.867672    2614 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:14.867680    2614 main.go:141] libmachine: Making call to close connection to plugin binary
I0906 16:45:14.867685    2614 main.go:141] libmachine: Making call to close driver server
I0906 16:45:14.867691    2614 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:14.867813    2614 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:14.867823    2614 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.17s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.17s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls --format table --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-283000 image ls --format table --alsologtostderr:
|---------------------------------------------|-------------------|---------------|--------|
|                    Image                    |        Tag        |   Image ID    |  Size  |
|---------------------------------------------|-------------------|---------------|--------|
| registry.k8s.io/kube-scheduler              | v1.28.1           | b462ce0c8b1ff | 60.1MB |
| registry.k8s.io/etcd                        | 3.5.9-0           | 73deb9a3f7025 | 294MB  |
| gcr.io/k8s-minikube/busybox                 | latest            | beae173ccac6a | 1.24MB |
| registry.k8s.io/pause                       | 3.1               | da86e6ba6ca19 | 742kB  |
| registry.k8s.io/echoserver                  | 1.8               | 82e4c8a736a4f | 95.4MB |
| registry.k8s.io/pause                       | latest            | 350b164e7ae1d | 240kB  |
| docker.io/library/minikube-local-cache-test | functional-283000 | ed8224f04fcea | 30B    |
| docker.io/library/nginx                     | alpine            | 433dbc17191a7 | 42.6MB |
| registry.k8s.io/kube-apiserver              | v1.28.1           | 5c801295c21d0 | 126MB  |
| registry.k8s.io/kube-proxy                  | v1.28.1           | 6cdbabde3874e | 73.1MB |
| docker.io/library/nginx                     | latest            | eea7b3dcba7ee | 187MB  |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc      | 56cc512116c8f | 4.4MB  |
| registry.k8s.io/kube-controller-manager     | v1.28.1           | 821b3dfea27be | 122MB  |
| registry.k8s.io/coredns/coredns             | v1.10.1           | ead0a4a53df89 | 53.6MB |
| gcr.io/k8s-minikube/storage-provisioner     | v5                | 6e38f40d628db | 31.5MB |
| docker.io/localhost/my-image                | functional-283000 | 6ce2e2525a1a5 | 1.24MB |
| docker.io/library/mysql                     | 5.7               | 92034fe9a41f4 | 581MB  |
| registry.k8s.io/pause                       | 3.9               | e6f1816883972 | 744kB  |
| docker.io/kubernetesui/metrics-scraper      | <none>            | 115053965e86b | 43.8MB |
| gcr.io/google-containers/addon-resizer      | functional-283000 | ffd4cfbbe753e | 32.9MB |
| registry.k8s.io/pause                       | 3.3               | 0184c1613d929 | 683kB  |
|---------------------------------------------|-------------------|---------------|--------|
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-283000 image ls --format table --alsologtostderr:
I0906 16:45:17.749135    2640 out.go:296] Setting OutFile to fd 1 ...
I0906 16:45:17.749336    2640 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:17.749342    2640 out.go:309] Setting ErrFile to fd 2...
I0906 16:45:17.749346    2640 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:17.749534    2640 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
I0906 16:45:17.750146    2640 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:17.750256    2640 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:17.750590    2640 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:17.750660    2640 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:17.757688    2640 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50625
I0906 16:45:17.758087    2640 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:17.758509    2640 main.go:141] libmachine: Using API Version  1
I0906 16:45:17.758527    2640 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:17.758716    2640 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:17.758825    2640 main.go:141] libmachine: (functional-283000) Calling .GetState
I0906 16:45:17.758898    2640 main.go:141] libmachine: (functional-283000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0906 16:45:17.758965    2640 main.go:141] libmachine: (functional-283000) DBG | hyperkit pid from json: 1847
I0906 16:45:17.760266    2640 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:17.760294    2640 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:17.767194    2640 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50627
I0906 16:45:17.767524    2640 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:17.767847    2640 main.go:141] libmachine: Using API Version  1
I0906 16:45:17.767856    2640 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:17.768067    2640 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:17.768177    2640 main.go:141] libmachine: (functional-283000) Calling .DriverName
I0906 16:45:17.768325    2640 ssh_runner.go:195] Run: systemctl --version
I0906 16:45:17.768344    2640 main.go:141] libmachine: (functional-283000) Calling .GetSSHHostname
I0906 16:45:17.768427    2640 main.go:141] libmachine: (functional-283000) Calling .GetSSHPort
I0906 16:45:17.768509    2640 main.go:141] libmachine: (functional-283000) Calling .GetSSHKeyPath
I0906 16:45:17.768594    2640 main.go:141] libmachine: (functional-283000) Calling .GetSSHUsername
I0906 16:45:17.768682    2640 sshutil.go:53] new ssh client: &{IP:192.168.64.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/functional-283000/id_rsa Username:docker}
I0906 16:45:17.816820    2640 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0906 16:45:17.845932    2640 main.go:141] libmachine: Making call to close driver server
I0906 16:45:17.845945    2640 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:17.846173    2640 main.go:141] libmachine: (functional-283000) DBG | Closing plugin on server side
I0906 16:45:17.846191    2640 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:17.846203    2640 main.go:141] libmachine: Making call to close connection to plugin binary
I0906 16:45:17.846212    2640 main.go:141] libmachine: Making call to close driver server
I0906 16:45:17.846222    2640 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:17.846369    2640 main.go:141] libmachine: (functional-283000) DBG | Closing plugin on server side
I0906 16:45:17.846379    2640 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:17.846394    2640 main.go:141] libmachine: Making call to close connection to plugin binary
2023/09/06 16:45:21 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.17s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageListJson (0.15s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls --format json --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-283000 image ls --format json --alsologtostderr:
[{"id":"beae173ccac6ad749f76713cf4440fe3d21d1043fe616dfbe30775815d1d0f6a","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:latest"],"size":"1240000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"742000"},{"id":"ed8224f04fcea55a44fae845265823d975987bee086fc100ad4bcedd68f8d46c","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-283000"],"size":"30"},{"id":"433dbc17191a7830a9db6454bcc23414ad36caecedab39d1e51d41083ab1d629","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"42600000"},{"id":"b462ce0c8b1ff16d466c6e8c9fcae54ec740fdeb73af6e637b77eea36246054a","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.28.1"],"size":"60100000"},{"id":"eea7b3dcba7ee47c0d16a60cc85d2b977d166be3960541991f3e6294d795ed24","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"187000000"},{"id":"92034fe9a41f4344b97f3fc88a8796248e2cfa9b934be583
79f3dbc150d07d9d","repoDigests":[],"repoTags":["docker.io/library/mysql:5.7"],"size":"581000000"},{"id":"e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.9"],"size":"744000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"240000"},{"id":"6ce2e2525a1a59d352cbbbeafb51dbcb9afb8d2d83323548eb5289710befaa15","repoDigests":[],"repoTags":["docker.io/localhost/my-image:functional-283000"],"size":"1240000"},{"id":"6cdbabde3874e1eca92441870b0ddeaef0edb514c3b3e2a3d5ade845b500bba5","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.28.1"],"size":"73100000"},{"id":"73deb9a3f702532592a4167455f8bf2e5f5d900bcc959ba2fd2d35c321de1af9","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.9-0"],"size":"294000000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"68300
0"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["registry.k8s.io/echoserver:1.8"],"size":"95400000"},{"id":"5c801295c21d0de2947ad600b9388f090f0f7ff22add9d9d95be82fa12288f77","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.28.1"],"size":"126000000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"821b3dfea27be94a3834878bec6f36d332c83250be3e3c2a2e2233575ebc9bac","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.28.1"],"size":"122000000"},{"id":"ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.10.1"],"size":"53600000"},{"id":"115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7","repoDigests":[],"repoTags":["docker.io/kubernetesui/metrics-scraper:\u003cnone\u003e"],"size":"43800000"},{"id":"6e38f40d628db
3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-283000"],"size":"32900000"}]
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-283000 image ls --format json --alsologtostderr:
I0906 16:45:17.597028    2636 out.go:296] Setting OutFile to fd 1 ...
I0906 16:45:17.597223    2636 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:17.597230    2636 out.go:309] Setting ErrFile to fd 2...
I0906 16:45:17.597234    2636 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:17.597417    2636 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
I0906 16:45:17.598054    2636 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:17.598150    2636 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:17.598491    2636 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:17.598533    2636 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:17.605660    2636 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50620
I0906 16:45:17.606064    2636 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:17.606491    2636 main.go:141] libmachine: Using API Version  1
I0906 16:45:17.606505    2636 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:17.606727    2636 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:17.606837    2636 main.go:141] libmachine: (functional-283000) Calling .GetState
I0906 16:45:17.606922    2636 main.go:141] libmachine: (functional-283000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0906 16:45:17.606991    2636 main.go:141] libmachine: (functional-283000) DBG | hyperkit pid from json: 1847
I0906 16:45:17.608259    2636 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:17.608281    2636 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:17.615355    2636 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50622
I0906 16:45:17.615695    2636 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:17.616028    2636 main.go:141] libmachine: Using API Version  1
I0906 16:45:17.616040    2636 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:17.616243    2636 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:17.616348    2636 main.go:141] libmachine: (functional-283000) Calling .DriverName
I0906 16:45:17.616496    2636 ssh_runner.go:195] Run: systemctl --version
I0906 16:45:17.616515    2636 main.go:141] libmachine: (functional-283000) Calling .GetSSHHostname
I0906 16:45:17.616597    2636 main.go:141] libmachine: (functional-283000) Calling .GetSSHPort
I0906 16:45:17.616718    2636 main.go:141] libmachine: (functional-283000) Calling .GetSSHKeyPath
I0906 16:45:17.616802    2636 main.go:141] libmachine: (functional-283000) Calling .GetSSHUsername
I0906 16:45:17.616885    2636 sshutil.go:53] new ssh client: &{IP:192.168.64.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/functional-283000/id_rsa Username:docker}
I0906 16:45:17.659017    2636 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0906 16:45:17.681766    2636 main.go:141] libmachine: Making call to close driver server
I0906 16:45:17.681776    2636 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:17.682023    2636 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:17.682031    2636 main.go:141] libmachine: Making call to close connection to plugin binary
I0906 16:45:17.682039    2636 main.go:141] libmachine: Making call to close driver server
I0906 16:45:17.682046    2636 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:17.682240    2636 main.go:141] libmachine: (functional-283000) DBG | Closing plugin on server side
I0906 16:45:17.682251    2636 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:17.682261    2636 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.15s)
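For reference, the JSON that `image ls --format json` prints above is an array of objects with `id`, `repoDigests`, `repoTags` and `size` fields. A minimal Go sketch that shells out to the same command and decodes that shape; the binary path and profile name are the ones used in this run and would need adjusting for another setup:

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// imageInfo mirrors the fields visible in the `image ls --format json` output above.
type imageInfo struct {
	ID          string   `json:"id"`
	RepoDigests []string `json:"repoDigests"`
	RepoTags    []string `json:"repoTags"`
	Size        string   `json:"size"`
}

func main() {
	// Binary path and profile name taken from this run; adjust locally.
	out, err := exec.Command("out/minikube-darwin-amd64", "-p", "functional-283000",
		"image", "ls", "--format", "json").Output()
	if err != nil {
		panic(err)
	}
	var images []imageInfo
	if err := json.Unmarshal(out, &images); err != nil {
		panic(err)
	}
	for _, img := range images {
		fmt.Println(img.RepoTags, img.Size)
	}
}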

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageListYaml (0.17s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls --format yaml --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-283000 image ls --format yaml --alsologtostderr:
- id: 92034fe9a41f4344b97f3fc88a8796248e2cfa9b934be58379f3dbc150d07d9d
repoDigests: []
repoTags:
- docker.io/library/mysql:5.7
size: "581000000"
- id: 73deb9a3f702532592a4167455f8bf2e5f5d900bcc959ba2fd2d35c321de1af9
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.9-0
size: "294000000"
- id: e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.9
size: "744000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "240000"
- id: 5c801295c21d0de2947ad600b9388f090f0f7ff22add9d9d95be82fa12288f77
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.28.1
size: "126000000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: 821b3dfea27be94a3834878bec6f36d332c83250be3e3c2a2e2233575ebc9bac
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.28.1
size: "122000000"
- id: b462ce0c8b1ff16d466c6e8c9fcae54ec740fdeb73af6e637b77eea36246054a
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.28.1
size: "60100000"
- id: eea7b3dcba7ee47c0d16a60cc85d2b977d166be3960541991f3e6294d795ed24
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "187000000"
- id: ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
repoDigests: []
repoTags:
- gcr.io/google-containers/addon-resizer:functional-283000
size: "32900000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "742000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- registry.k8s.io/echoserver:1.8
size: "95400000"
- id: ed8224f04fcea55a44fae845265823d975987bee086fc100ad4bcedd68f8d46c
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-283000
size: "30"
- id: 6cdbabde3874e1eca92441870b0ddeaef0edb514c3b3e2a3d5ade845b500bba5
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.28.1
size: "73100000"
- id: ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.10.1
size: "53600000"
- id: 115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7
repoDigests: []
repoTags:
- docker.io/kubernetesui/metrics-scraper:<none>
size: "43800000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "683000"
- id: 433dbc17191a7830a9db6454bcc23414ad36caecedab39d1e51d41083ab1d629
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "42600000"

                                                
                                                
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-283000 image ls --format yaml --alsologtostderr:
I0906 16:45:14.937212    2619 out.go:296] Setting OutFile to fd 1 ...
I0906 16:45:14.937439    2619 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:14.937447    2619 out.go:309] Setting ErrFile to fd 2...
I0906 16:45:14.937451    2619 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:14.937641    2619 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
I0906 16:45:14.938269    2619 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:14.938368    2619 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:14.938729    2619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:14.938781    2619 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:14.946267    2619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50597
I0906 16:45:14.946697    2619 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:14.947155    2619 main.go:141] libmachine: Using API Version  1
I0906 16:45:14.947166    2619 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:14.947432    2619 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:14.947552    2619 main.go:141] libmachine: (functional-283000) Calling .GetState
I0906 16:45:14.947683    2619 main.go:141] libmachine: (functional-283000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0906 16:45:14.947800    2619 main.go:141] libmachine: (functional-283000) DBG | hyperkit pid from json: 1847
I0906 16:45:14.949277    2619 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:14.949297    2619 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:14.956682    2619 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50599
I0906 16:45:14.957094    2619 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:14.957510    2619 main.go:141] libmachine: Using API Version  1
I0906 16:45:14.957526    2619 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:14.957827    2619 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:14.957947    2619 main.go:141] libmachine: (functional-283000) Calling .DriverName
I0906 16:45:14.958163    2619 ssh_runner.go:195] Run: systemctl --version
I0906 16:45:14.958217    2619 main.go:141] libmachine: (functional-283000) Calling .GetSSHHostname
I0906 16:45:14.958340    2619 main.go:141] libmachine: (functional-283000) Calling .GetSSHPort
I0906 16:45:14.958445    2619 main.go:141] libmachine: (functional-283000) Calling .GetSSHKeyPath
I0906 16:45:14.958536    2619 main.go:141] libmachine: (functional-283000) Calling .GetSSHUsername
I0906 16:45:14.958647    2619 sshutil.go:53] new ssh client: &{IP:192.168.64.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/functional-283000/id_rsa Username:docker}
I0906 16:45:14.999979    2619 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0906 16:45:15.040719    2619 main.go:141] libmachine: Making call to close driver server
I0906 16:45:15.040729    2619 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:15.040887    2619 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:15.040895    2619 main.go:141] libmachine: Making call to close connection to plugin binary
I0906 16:45:15.040901    2619 main.go:141] libmachine: Making call to close driver server
I0906 16:45:15.040906    2619 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:15.041041    2619 main.go:141] libmachine: (functional-283000) DBG | Closing plugin on server side
I0906 16:45:15.041068    2619 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:15.041080    2619 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.17s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageBuild (2.49s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:307: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh pgrep buildkitd
functional_test.go:307: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh pgrep buildkitd: exit status 1 (128.871611ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image build -t localhost/my-image:functional-283000 testdata/build --alsologtostderr
functional_test.go:314: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 image build -t localhost/my-image:functional-283000 testdata/build --alsologtostderr: (2.206489317s)
functional_test.go:319: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-283000 image build -t localhost/my-image:functional-283000 testdata/build --alsologtostderr:
Sending build context to Docker daemon  3.072kB

Step 1/3 : FROM gcr.io/k8s-minikube/busybox
latest: Pulling from k8s-minikube/busybox
5cc84ad355aa: Pulling fs layer
5cc84ad355aa: Verifying Checksum
5cc84ad355aa: Download complete
5cc84ad355aa: Pull complete
Digest: sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:latest
---> beae173ccac6
Step 2/3 : RUN true
---> Running in b450d9db3d3c
Removing intermediate container b450d9db3d3c
---> fa43a97ecf5b
Step 3/3 : ADD content.txt /
---> 6ce2e2525a1a
Successfully built 6ce2e2525a1a
Successfully tagged localhost/my-image:functional-283000
functional_test.go:322: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-283000 image build -t localhost/my-image:functional-283000 testdata/build --alsologtostderr:
I0906 16:45:15.245716    2628 out.go:296] Setting OutFile to fd 1 ...
I0906 16:45:15.246069    2628 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:15.246076    2628 out.go:309] Setting ErrFile to fd 2...
I0906 16:45:15.246080    2628 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0906 16:45:15.246267    2628 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
I0906 16:45:15.246858    2628 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:15.247455    2628 config.go:182] Loaded profile config "functional-283000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
I0906 16:45:15.247828    2628 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:15.247869    2628 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:15.255275    2628 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50609
I0906 16:45:15.255791    2628 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:15.256383    2628 main.go:141] libmachine: Using API Version  1
I0906 16:45:15.256397    2628 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:15.256639    2628 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:15.256758    2628 main.go:141] libmachine: (functional-283000) Calling .GetState
I0906 16:45:15.256849    2628 main.go:141] libmachine: (functional-283000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0906 16:45:15.256928    2628 main.go:141] libmachine: (functional-283000) DBG | hyperkit pid from json: 1847
I0906 16:45:15.258326    2628 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0906 16:45:15.258373    2628 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0906 16:45:15.265928    2628 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:50611
I0906 16:45:15.266283    2628 main.go:141] libmachine: () Calling .GetVersion
I0906 16:45:15.266722    2628 main.go:141] libmachine: Using API Version  1
I0906 16:45:15.266740    2628 main.go:141] libmachine: () Calling .SetConfigRaw
I0906 16:45:15.267005    2628 main.go:141] libmachine: () Calling .GetMachineName
I0906 16:45:15.267123    2628 main.go:141] libmachine: (functional-283000) Calling .DriverName
I0906 16:45:15.267279    2628 ssh_runner.go:195] Run: systemctl --version
I0906 16:45:15.267299    2628 main.go:141] libmachine: (functional-283000) Calling .GetSSHHostname
I0906 16:45:15.267402    2628 main.go:141] libmachine: (functional-283000) Calling .GetSSHPort
I0906 16:45:15.267494    2628 main.go:141] libmachine: (functional-283000) Calling .GetSSHKeyPath
I0906 16:45:15.267597    2628 main.go:141] libmachine: (functional-283000) Calling .GetSSHUsername
I0906 16:45:15.267705    2628 sshutil.go:53] new ssh client: &{IP:192.168.64.4 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/functional-283000/id_rsa Username:docker}
I0906 16:45:15.307492    2628 build_images.go:151] Building image from path: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/build.53473794.tar
I0906 16:45:15.307576    2628 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I0906 16:45:15.316570    2628 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.53473794.tar
I0906 16:45:15.321435    2628 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.53473794.tar: stat -c "%s %y" /var/lib/minikube/build/build.53473794.tar: Process exited with status 1
stdout:

                                                
                                                
stderr:
stat: cannot statx '/var/lib/minikube/build/build.53473794.tar': No such file or directory
I0906 16:45:15.321481    2628 ssh_runner.go:362] scp /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/build.53473794.tar --> /var/lib/minikube/build/build.53473794.tar (3072 bytes)
I0906 16:45:15.347923    2628 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.53473794
I0906 16:45:15.357606    2628 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.53473794 -xf /var/lib/minikube/build/build.53473794.tar
I0906 16:45:15.366991    2628 docker.go:339] Building image: /var/lib/minikube/build/build.53473794
I0906 16:45:15.367112    2628 ssh_runner.go:195] Run: docker build -t localhost/my-image:functional-283000 /var/lib/minikube/build/build.53473794
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
Install the buildx component to build images with BuildKit:
https://docs.docker.com/go/buildx/

                                                
                                                
I0906 16:45:17.359205    2628 ssh_runner.go:235] Completed: docker build -t localhost/my-image:functional-283000 /var/lib/minikube/build/build.53473794: (1.992029019s)
I0906 16:45:17.359276    2628 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.53473794
I0906 16:45:17.366066    2628 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.53473794.tar
I0906 16:45:17.373573    2628 build_images.go:207] Built localhost/my-image:functional-283000 from /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/build.53473794.tar
I0906 16:45:17.373607    2628 build_images.go:123] succeeded building to: functional-283000
I0906 16:45:17.373611    2628 build_images.go:124] failed building to: 
I0906 16:45:17.373650    2628 main.go:141] libmachine: Making call to close driver server
I0906 16:45:17.373658    2628 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:17.373906    2628 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:17.373929    2628 main.go:141] libmachine: Making call to close connection to plugin binary
I0906 16:45:17.373936    2628 main.go:141] libmachine: Making call to close driver server
I0906 16:45:17.373970    2628 main.go:141] libmachine: (functional-283000) Calling .Close
I0906 16:45:17.373973    2628 main.go:141] libmachine: (functional-283000) DBG | Closing plugin on server side
I0906 16:45:17.374213    2628 main.go:141] libmachine: (functional-283000) DBG | Closing plugin on server side
I0906 16:45:17.374217    2628 main.go:141] libmachine: Successfully made call to close driver server
I0906 16:45:17.374226    2628 main.go:141] libmachine: Making call to close connection to plugin binary
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (2.49s)
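The build flow above (ssh pgrep buildkitd, then `image build`, then `image ls`) can be reproduced outside the test harness. A rough Go sketch of the build-then-verify step using the same commands; the binary path, profile, tag and build context are the ones from this run and are otherwise assumptions:

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	// Values taken from this run; adjust minikube path, profile and build context locally.
	minikube := "out/minikube-darwin-amd64"
	profile := "functional-283000"
	tag := "localhost/my-image:" + profile

	// Build the image from the same testdata/build context the test uses.
	if out, err := exec.Command(minikube, "-p", profile, "image", "build",
		"-t", tag, "testdata/build", "--alsologtostderr").CombinedOutput(); err != nil {
		panic(fmt.Sprintf("build failed: %v\n%s", err, out))
	}

	// Confirm the new tag shows up in `image ls`, mirroring the follow-up check at functional_test.go:447.
	out, err := exec.Command(minikube, "-p", profile, "image", "ls").Output()
	if err != nil {
		panic(err)
	}
	if strings.Contains(string(out), tag) {
		fmt.Println("built and listed:", tag)
	} else {
		fmt.Println("tag not found in image ls output:", tag)
	}
}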

                                                
                                    
TestFunctional/parallel/ImageCommands/Setup (2.3s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:341: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:341: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (2.214448496s)
functional_test.go:346: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-283000
--- PASS: TestFunctional/parallel/ImageCommands/Setup (2.30s)

                                                
                                    
TestFunctional/parallel/DockerEnv/bash (0.73s)

                                                
                                                
=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:495: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-283000 docker-env) && out/minikube-darwin-amd64 status -p functional-283000"
functional_test.go:518: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-283000 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.73s)

                                                
                                    
TestFunctional/parallel/UpdateContextCmd/no_changes (0.17s)

                                                
                                                
=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.17s)

                                                
                                    
TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.17s)

                                                
                                                
=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.17s)

                                                
                                    
TestFunctional/parallel/UpdateContextCmd/no_clusters (0.18s)

                                                
                                                
=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.18s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.31s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:354: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image load --daemon gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr
functional_test.go:354: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 image load --daemon gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr: (3.137043337s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.31s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.42s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:364: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image load --daemon gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr
functional_test.go:364: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 image load --daemon gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr: (2.262271591s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.42s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (5.34s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:234: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:234: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.9: (1.919398812s)
functional_test.go:239: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-283000
functional_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image load --daemon gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr
E0906 16:44:22.933172    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 16:44:22.939142    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 16:44:22.949353    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
functional_test.go:244: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 image load --daemon gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr: (3.217684104s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls
E0906 16:44:22.970158    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 16:44:23.011156    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 16:44:23.091315    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (5.34s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.26s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:379: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image save gcr.io/google-containers/addon-resizer:functional-283000 /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr
E0906 16:44:23.251618    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 16:44:23.571758    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 16:44:24.212114    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
functional_test.go:379: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 image save gcr.io/google-containers/addon-resizer:functional-283000 /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr: (1.257428896s)
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.26s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageRemove (0.34s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:391: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image rm gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.34s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.3s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:408: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image load /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr
E0906 16:44:25.492331    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
functional_test.go:408: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 image load /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr: (1.154402726s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.30s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageSaveDaemon (1.4s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:418: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-283000
functional_test.go:423: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 image save --daemon gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr
functional_test.go:423: (dbg) Done: out/minikube-darwin-amd64 -p functional-283000 image save --daemon gcr.io/google-containers/addon-resizer:functional-283000 --alsologtostderr: (1.299549486s)
functional_test.go:428: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-283000
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (1.40s)

                                                
                                    
TestFunctional/parallel/ProfileCmd/profile_not_create (0.31s)

                                                
                                                
=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1269: (dbg) Run:  out/minikube-darwin-amd64 profile lis
E0906 16:44:28.054432    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
functional_test.go:1274: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.31s)

                                                
                                    
TestFunctional/parallel/ProfileCmd/profile_list (0.27s)

                                                
                                                
=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1309: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1314: Took "198.531558ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1323: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1328: Took "67.402286ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.27s)

                                                
                                    
TestFunctional/parallel/ProfileCmd/profile_json_output (0.27s)

                                                
                                                
=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1360: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1365: Took "203.36162ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1373: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1378: Took "65.981904ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.27s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.41s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-283000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-283000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-283000 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to kill pid 2335: os: process already finished
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-283000 tunnel --alsologtostderr] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.41s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-283000 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (11.23s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-283000 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:344: "nginx-svc" [107a737b-cbe0-4968-8cd9-b23f95f62e0f] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx-svc" [107a737b-cbe0-4968-8cd9-b23f95f62e0f] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 11.030584199s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (11.23s)
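The readiness wait above is handled by the test helpers; roughly the same check can be written directly against the cluster with client-go. A sketch under the assumption that the default kubeconfig selects the functional-283000 context; the label selector, namespace and 4m0s budget are taken from the log:

package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes ~/.kube/config points at the functional-283000 cluster.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	deadline := time.Now().Add(4 * time.Minute) // same 4m0s budget the test uses
	for time.Now().Before(deadline) {
		pods, err := client.CoreV1().Pods("default").List(context.TODO(),
			metav1.ListOptions{LabelSelector: "run=nginx-svc"})
		if err != nil {
			panic(err)
		}
		for _, p := range pods.Items {
			if p.Status.Phase == corev1.PodRunning {
				fmt.Println("nginx-svc pod is running:", p.Name)
				return
			}
		}
		time.Sleep(2 * time.Second)
	}
	fmt.Println("timed out waiting for run=nginx-svc")
}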

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-283000 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.107.249.30 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.03s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:319: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:327: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.03s)
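The dig check above queries the cluster DNS service directly at 10.96.0.10. The same lookup can be done from Go by pointing a resolver at that address; a small sketch using the name and server from the log (this only resolves while `minikube tunnel` is routing cluster traffic to the host, as in the test):

package main

import (
	"context"
	"fmt"
	"net"
	"time"
)

func main() {
	// Cluster DNS address and service name taken from the dig invocation above.
	r := &net.Resolver{
		PreferGo: true,
		Dial: func(ctx context.Context, network, address string) (net.Conn, error) {
			d := net.Dialer{Timeout: 5 * time.Second}
			return d.DialContext(ctx, network, "10.96.0.10:53")
		},
	}
	addrs, err := r.LookupHost(context.Background(), "nginx-svc.default.svc.cluster.local.")
	if err != nil {
		panic(err)
	}
	fmt.Println("resolved to:", addrs)
}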

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:351: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:359: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:424: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

                                                
                                    
TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-darwin-amd64 -p functional-283000 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/DeployApp (8.11s)

                                                
                                                
=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1438: (dbg) Run:  kubectl --context functional-283000 create deployment hello-node --image=registry.k8s.io/echoserver:1.8
functional_test.go:1444: (dbg) Run:  kubectl --context functional-283000 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1449: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:344: "hello-node-d7447cc7f-jd68b" [260c628e-96b7-4170-afb8-d594a08fc4ac] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-d7447cc7f-jd68b" [260c628e-96b7-4170-afb8-d594a08fc4ac] Running
functional_test.go:1449: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 8.006852241s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (8.11s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/List (0.76s)

                                                
                                                
=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1458: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.76s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/JSONOutput (0.76s)

                                                
                                                
=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1488: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 service list -o json
functional_test.go:1493: Took "758.004126ms" to run "out/minikube-darwin-amd64 -p functional-283000 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.76s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/HTTPS (0.42s)

                                                
                                                
=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1508: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 service --namespace=default --https --url hello-node
functional_test.go:1521: found endpoint: https://192.168.64.4:31000
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.42s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/Format (0.43s)

                                                
                                                
=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1539: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.43s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/URL (0.43s)

                                                
                                                
=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1558: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 service hello-node --url
functional_test.go:1564: found endpoint for hello-node: http://192.168.64.4:31000
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.43s)
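`service hello-node --url` prints the reachable NodePort endpoint (http://192.168.64.4:31000 in this run). A short Go sketch that discovers the URL the same way and makes one request against it; the binary path and profile are again the ones from this run:

package main

import (
	"fmt"
	"net/http"
	"os/exec"
	"strings"
)

func main() {
	// Ask minikube for the service URL, as the test does at functional_test.go:1558.
	out, err := exec.Command("out/minikube-darwin-amd64", "-p", "functional-283000",
		"service", "hello-node", "--url").Output()
	if err != nil {
		panic(err)
	}
	url := strings.TrimSpace(string(out)) // e.g. http://192.168.64.4:31000 in this run
	resp, err := http.Get(url)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("GET", url, "->", resp.Status)
}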

                                                
                                    
TestFunctional/parallel/MountCmd/specific-port (1.7s)

                                                
                                                
=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdspecific-port808490591/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (118.779476ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdspecific-port808490591/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh "sudo umount -f /mount-9p": exit status 1 (118.101836ms)

                                                
                                                
-- stdout --
	umount: /mount-9p: not mounted.

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

                                                
                                                
** /stderr **
functional_test_mount_test.go:232: "out/minikube-darwin-amd64 -p functional-283000 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdspecific-port808490591/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.70s)

                                                
                                    
TestFunctional/parallel/MountCmd/VerifyCleanup (1.95s)

                                                
                                                
=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup675835569/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup675835569/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup675835569/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T" /mount1: exit status 1 (173.614821ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T" /mount1: exit status 1 (166.171279ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-283000 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-darwin-amd64 mount -p functional-283000 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup675835569/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup675835569/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-283000 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup675835569/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.95s)
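
The VerifyCleanup steps above poll findmnt -T over "minikube ssh" until the 9p mounts become visible, then tear them down with "minikube mount --kill=true". A minimal Go sketch of that polling pattern (illustrative only, not a helper from the test suite; the binary path, profile name and mount path are simply reused from the log above):

// waitformount.go: re-run findmnt over `minikube ssh` until a path is mounted.
// Hypothetical sketch modelled on the findmnt retries captured above.
package main

import (
	"fmt"
	"os/exec"
	"time"
)

// waitForMount retries `minikube -p <profile> ssh -- findmnt -T <path>` until it
// exits 0 or the deadline passes, mirroring the retry loop in the mount test output.
func waitForMount(profile, path string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		cmd := exec.Command("out/minikube-darwin-amd64", "-p", profile,
			"ssh", "--", "findmnt", "-T", path)
		if err := cmd.Run(); err == nil {
			return nil // findmnt located the mount point
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("%s not mounted in profile %s after %v", path, profile, timeout)
		}
		time.Sleep(250 * time.Millisecond)
	}
}

func main() {
	if err := waitForMount("functional-283000", "/mount1", 30*time.Second); err != nil {
		fmt.Println(err)
	}
}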

TestFunctional/delete_addon-resizer_images (0.13s)

=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-283000
--- PASS: TestFunctional/delete_addon-resizer_images (0.13s)

TestFunctional/delete_my-image_image (0.05s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:197: (dbg) Run:  docker rmi -f localhost/my-image:functional-283000
--- PASS: TestFunctional/delete_my-image_image (0.05s)

TestFunctional/delete_minikube_cached_images (0.05s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:205: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-283000
--- PASS: TestFunctional/delete_minikube_cached_images (0.05s)

TestIngressAddonLegacy/StartLegacyK8sCluster (71.37s)

=== RUN   TestIngressAddonLegacy/StartLegacyK8sCluster
ingress_addon_legacy_test.go:39: (dbg) Run:  out/minikube-darwin-amd64 start -p ingress-addon-legacy-656000 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit 
ingress_addon_legacy_test.go:39: (dbg) Done: out/minikube-darwin-amd64 start -p ingress-addon-legacy-656000 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit : (1m11.369361093s)
--- PASS: TestIngressAddonLegacy/StartLegacyK8sCluster (71.37s)

TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (11.77s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddonActivation
ingress_addon_legacy_test.go:70: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 addons enable ingress --alsologtostderr -v=5
E0906 16:47:06.781646    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
ingress_addon_legacy_test.go:70: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 addons enable ingress --alsologtostderr -v=5: (11.769366854s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (11.77s)

TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.61s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation
ingress_addon_legacy_test.go:79: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 addons enable ingress-dns --alsologtostderr -v=5
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.61s)

TestIngressAddonLegacy/serial/ValidateIngressAddons (38.26s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddons
addons_test.go:183: (dbg) Run:  kubectl --context ingress-addon-legacy-656000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:183: (dbg) Done: kubectl --context ingress-addon-legacy-656000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s: (12.161691505s)
addons_test.go:208: (dbg) Run:  kubectl --context ingress-addon-legacy-656000 replace --force -f testdata/nginx-ingress-v1beta1.yaml
addons_test.go:221: (dbg) Run:  kubectl --context ingress-addon-legacy-656000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:226: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [b1b68d4f-d5bf-451b-ab71-43e4ae82914f] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [b1b68d4f-d5bf-451b-ab71-43e4ae82914f] Running
addons_test.go:226: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: run=nginx healthy within 9.010844565s
addons_test.go:238: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:262: (dbg) Run:  kubectl --context ingress-addon-legacy-656000 replace --force -f testdata/ingress-dns-example-v1beta1.yaml
addons_test.go:267: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 ip
addons_test.go:273: (dbg) Run:  nslookup hello-john.test 192.168.64.6
addons_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:282: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 addons disable ingress-dns --alsologtostderr -v=1: (8.850435226s)
addons_test.go:287: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 addons disable ingress --alsologtostderr -v=1
addons_test.go:287: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-656000 addons disable ingress --alsologtostderr -v=1: (7.268965224s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddons (38.26s)

TestJSONOutput/start/Command (51.39s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-157000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-157000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (51.389344473s)
--- PASS: TestJSONOutput/start/Command (51.39s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.45s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-157000 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.45s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.41s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-157000 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.41s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.16s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-157000 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-157000 --output=json --user=testUser: (8.16396674s)
--- PASS: TestJSONOutput/stop/Command (8.16s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.7s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-868000 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-868000 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (347.152661ms)

-- stdout --
	{"specversion":"1.0","id":"bde2cdfc-870a-4ff7-a7a0-3b94d2cc3994","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-868000] minikube v1.31.2 on Darwin 13.5.1","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"e5bf42eb-50d9-440b-91f4-01f27c3c272b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=17174"}}
	{"specversion":"1.0","id":"95b053ff-faa0-409e-831a-79b3d7d2a404","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig"}}
	{"specversion":"1.0","id":"05484f97-ff4e-46ad-bef9-2f2744a5b55e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"9db40813-7520-4d82-ac0f-1b4551c2f881","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"16fd93b7-dbbb-4398-bfe7-f2da0126d69e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube"}}
	{"specversion":"1.0","id":"dcd7ac4d-214e-47ad-95c3-fe68f252854b","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"7e8224fc-6564-4556-a82a-466f84cd3ddb","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-868000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-868000
--- PASS: TestErrorJSONOutput (0.70s)
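
Every line in the TestErrorJSONOutput stdout above is a CloudEvents-style JSON envelope with specversion, id, source, type and a data map of strings. A short Go sketch for consuming such a stream line by line (an illustration based only on the fields visible in that captured output, not minikube's own types):

// events.go: decode line-delimited minikube --output=json events from stdin.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// event mirrors the envelope visible in the captured stdout above.
type event struct {
	SpecVersion string            `json:"specversion"`
	ID          string            `json:"id"`
	Source      string            `json:"source"`
	Type        string            `json:"type"`
	Data        map[string]string `json:"data"`
}

func main() {
	sc := bufio.NewScanner(os.Stdin) // e.g. pipe `minikube start --output=json` into this program
	for sc.Scan() {
		var e event
		if err := json.Unmarshal(sc.Bytes(), &e); err != nil {
			continue // skip anything that is not a JSON event line
		}
		// Error events carry an exit code and message, as in the DRV_UNSUPPORTED_OS line above.
		if e.Type == "io.k8s.sigs.minikube.error" {
			fmt.Printf("error %s: %s\n", e.Data["exitcode"], e.Data["message"])
			continue
		}
		fmt.Println(e.Data["message"])
	}
}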

TestMainNoArgs (0.06s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.06s)

TestMountStart/serial/StartWithMountFirst (16.24s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-171000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
E0906 16:49:32.522452    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-1-171000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : (15.238408642s)
--- PASS: TestMountStart/serial/StartWithMountFirst (16.24s)

TestMountStart/serial/VerifyMountFirst (0.28s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-171000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-171000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.28s)

TestMountStart/serial/StartWithMountSecond (16.19s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-191000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit 
E0906 16:49:50.621976    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 16:49:53.002811    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-191000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit : (15.190866302s)
--- PASS: TestMountStart/serial/StartWithMountSecond (16.19s)

TestMountStart/serial/VerifyMountSecond (0.28s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-191000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-191000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.28s)

TestMountStart/serial/DeleteFirst (2.41s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p mount-start-1-171000 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p mount-start-1-171000 --alsologtostderr -v=5: (2.407198739s)
--- PASS: TestMountStart/serial/DeleteFirst (2.41s)

TestMountStart/serial/VerifyMountPostDelete (0.28s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-191000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-191000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.28s)

TestMountStart/serial/Stop (2.21s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 stop -p mount-start-2-191000
mount_start_test.go:155: (dbg) Done: out/minikube-darwin-amd64 stop -p mount-start-2-191000: (2.21010367s)
--- PASS: TestMountStart/serial/Stop (2.21s)

TestMountStart/serial/RestartStopped (16.61s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-191000
mount_start_test.go:166: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-191000: (15.611291805s)
--- PASS: TestMountStart/serial/RestartStopped (16.61s)

TestMountStart/serial/VerifyMountPostStop (0.28s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-191000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-191000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.28s)

TestMultiNode/serial/FreshStart2Nodes (97.2s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:85: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-118000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E0906 16:50:33.964650    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:51:55.886129    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
multinode_test.go:85: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-118000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (1m36.985719938s)
multinode_test.go:91: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (97.20s)

TestMultiNode/serial/DeployApp2Nodes (4.45s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:481: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:486: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- rollout status deployment/busybox
multinode_test.go:486: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-118000 -- rollout status deployment/busybox: (2.908006915s)
multinode_test.go:493: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:516: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:524: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-j8g5w -- nslookup kubernetes.io
multinode_test.go:524: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-qkm9f -- nslookup kubernetes.io
multinode_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-j8g5w -- nslookup kubernetes.default
multinode_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-qkm9f -- nslookup kubernetes.default
multinode_test.go:542: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-j8g5w -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:542: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-qkm9f -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (4.45s)

TestMultiNode/serial/PingHostFrom2Pods (0.8s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:552: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:560: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-j8g5w -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:571: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-j8g5w -- sh -c "ping -c 1 192.168.64.1"
multinode_test.go:560: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-qkm9f -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:571: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-118000 -- exec busybox-5bc68d56bd-qkm9f -- sh -c "ping -c 1 192.168.64.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.80s)

TestMultiNode/serial/AddNode (32.42s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-118000 -v 3 --alsologtostderr
E0906 16:52:15.087069    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:15.092231    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:15.103217    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:15.124795    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:15.166660    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:15.246866    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:15.408199    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:15.728734    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:16.369358    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:17.650517    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:20.211666    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:25.332601    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:52:35.573146    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
multinode_test.go:110: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-118000 -v 3 --alsologtostderr: (32.130196172s)
multinode_test.go:116: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (32.42s)

TestMultiNode/serial/ProfileList (0.18s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.18s)

TestMultiNode/serial/CopyFile (4.78s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status --output json --alsologtostderr
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp testdata/cp-test.txt multinode-118000:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000:/home/docker/cp-test.txt /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestMultiNodeserialCopyFile2503040974/001/cp-test_multinode-118000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000:/home/docker/cp-test.txt multinode-118000-m02:/home/docker/cp-test_multinode-118000_multinode-118000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m02 "sudo cat /home/docker/cp-test_multinode-118000_multinode-118000-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000:/home/docker/cp-test.txt multinode-118000-m03:/home/docker/cp-test_multinode-118000_multinode-118000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m03 "sudo cat /home/docker/cp-test_multinode-118000_multinode-118000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp testdata/cp-test.txt multinode-118000-m02:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000-m02:/home/docker/cp-test.txt /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestMultiNodeserialCopyFile2503040974/001/cp-test_multinode-118000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000-m02:/home/docker/cp-test.txt multinode-118000:/home/docker/cp-test_multinode-118000-m02_multinode-118000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000 "sudo cat /home/docker/cp-test_multinode-118000-m02_multinode-118000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000-m02:/home/docker/cp-test.txt multinode-118000-m03:/home/docker/cp-test_multinode-118000-m02_multinode-118000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m03 "sudo cat /home/docker/cp-test_multinode-118000-m02_multinode-118000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp testdata/cp-test.txt multinode-118000-m03:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000-m03:/home/docker/cp-test.txt /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestMultiNodeserialCopyFile2503040974/001/cp-test_multinode-118000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000-m03:/home/docker/cp-test.txt multinode-118000:/home/docker/cp-test_multinode-118000-m03_multinode-118000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000 "sudo cat /home/docker/cp-test_multinode-118000-m03_multinode-118000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 cp multinode-118000-m03:/home/docker/cp-test.txt multinode-118000-m02:/home/docker/cp-test_multinode-118000-m03_multinode-118000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 ssh -n multinode-118000-m02 "sudo cat /home/docker/cp-test_multinode-118000-m03_multinode-118000-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (4.78s)
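
The CopyFile steps above exercise the same round trip repeatedly: "minikube cp" a file onto a node, then "minikube ssh -n <node>" and sudo cat it back. A compact Go sketch of that check (a hypothetical helper written for illustration; the binary path, profile and node names are taken from the log, and nothing here is the suite's own code):

// cpcheck.go: copy a file into a multinode profile and read it back over ssh.
package main

import (
	"bytes"
	"fmt"
	"os"
	"os/exec"
)

const minikube = "out/minikube-darwin-amd64"

// copyAndVerify pushes src to dst on the given node, cats it back with sudo,
// and compares the remote bytes against the local file.
func copyAndVerify(profile, node, src, dst string) error {
	if out, err := exec.Command(minikube, "-p", profile, "cp", src, node+":"+dst).CombinedOutput(); err != nil {
		return fmt.Errorf("cp failed: %v\n%s", err, out)
	}
	remote, err := exec.Command(minikube, "-p", profile, "ssh", "-n", node, "sudo cat "+dst).Output()
	if err != nil {
		return fmt.Errorf("ssh cat failed: %v", err)
	}
	local, err := os.ReadFile(src)
	if err != nil {
		return err
	}
	if !bytes.Equal(bytes.TrimSpace(remote), bytes.TrimSpace(local)) {
		return fmt.Errorf("content mismatch for %s on %s", dst, node)
	}
	return nil
}

func main() {
	err := copyAndVerify("multinode-118000", "multinode-118000-m02", "testdata/cp-test.txt", "/home/docker/cp-test.txt")
	if err != nil {
		fmt.Println(err)
	}
}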

TestMultiNode/serial/StopNode (2.64s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:210: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 node stop m03
multinode_test.go:210: (dbg) Done: out/minikube-darwin-amd64 -p multinode-118000 node stop m03: (2.176583931s)
multinode_test.go:216: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status
multinode_test.go:216: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-118000 status: exit status 7 (232.358329ms)

-- stdout --
	multinode-118000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-118000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-118000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status --alsologtostderr
multinode_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-118000 status --alsologtostderr: exit status 7 (227.225965ms)

-- stdout --
	multinode-118000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-118000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-118000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0906 16:52:43.825403    3396 out.go:296] Setting OutFile to fd 1 ...
	I0906 16:52:43.825589    3396 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:52:43.825596    3396 out.go:309] Setting ErrFile to fd 2...
	I0906 16:52:43.825600    3396 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:52:43.825777    3396 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	I0906 16:52:43.825960    3396 out.go:303] Setting JSON to false
	I0906 16:52:43.825983    3396 mustload.go:65] Loading cluster: multinode-118000
	I0906 16:52:43.826037    3396 notify.go:220] Checking for updates...
	I0906 16:52:43.826273    3396 config.go:182] Loaded profile config "multinode-118000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 16:52:43.826287    3396 status.go:255] checking status of multinode-118000 ...
	I0906 16:52:43.826655    3396 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:52:43.826709    3396 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:52:43.833582    3396 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51484
	I0906 16:52:43.833931    3396 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:52:43.834382    3396 main.go:141] libmachine: Using API Version  1
	I0906 16:52:43.834395    3396 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:52:43.834584    3396 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:52:43.834682    3396 main.go:141] libmachine: (multinode-118000) Calling .GetState
	I0906 16:52:43.834769    3396 main.go:141] libmachine: (multinode-118000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 16:52:43.834826    3396 main.go:141] libmachine: (multinode-118000) DBG | hyperkit pid from json: 3111
	I0906 16:52:43.836005    3396 status.go:330] multinode-118000 host status = "Running" (err=<nil>)
	I0906 16:52:43.836022    3396 host.go:66] Checking if "multinode-118000" exists ...
	I0906 16:52:43.836252    3396 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:52:43.836273    3396 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:52:43.843039    3396 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51486
	I0906 16:52:43.843351    3396 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:52:43.843703    3396 main.go:141] libmachine: Using API Version  1
	I0906 16:52:43.843719    3396 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:52:43.843949    3396 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:52:43.844083    3396 main.go:141] libmachine: (multinode-118000) Calling .GetIP
	I0906 16:52:43.844157    3396 host.go:66] Checking if "multinode-118000" exists ...
	I0906 16:52:43.844401    3396 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:52:43.844435    3396 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:52:43.854975    3396 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51488
	I0906 16:52:43.855400    3396 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:52:43.855748    3396 main.go:141] libmachine: Using API Version  1
	I0906 16:52:43.855759    3396 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:52:43.855944    3396 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:52:43.856045    3396 main.go:141] libmachine: (multinode-118000) Calling .DriverName
	I0906 16:52:43.856178    3396 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0906 16:52:43.856197    3396 main.go:141] libmachine: (multinode-118000) Calling .GetSSHHostname
	I0906 16:52:43.856277    3396 main.go:141] libmachine: (multinode-118000) Calling .GetSSHPort
	I0906 16:52:43.856355    3396 main.go:141] libmachine: (multinode-118000) Calling .GetSSHKeyPath
	I0906 16:52:43.856450    3396 main.go:141] libmachine: (multinode-118000) Calling .GetSSHUsername
	I0906 16:52:43.856530    3396 sshutil.go:53] new ssh client: &{IP:192.168.64.11 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/multinode-118000/id_rsa Username:docker}
	I0906 16:52:43.890766    3396 ssh_runner.go:195] Run: systemctl --version
	I0906 16:52:43.894351    3396 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 16:52:43.903282    3396 kubeconfig.go:92] found "multinode-118000" server: "https://192.168.64.11:8443"
	I0906 16:52:43.903321    3396 api_server.go:166] Checking apiserver status ...
	I0906 16:52:43.903357    3396 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0906 16:52:43.911711    3396 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1905/cgroup
	I0906 16:52:43.917849    3396 api_server.go:182] apiserver freezer: "2:freezer:/kubepods/burstable/pod147592c7d7e95e3d9b61b36c4f0c377c/7d058d2c8fbef9b141826a7568560cc70f0be223e77ccb9e7356697e062a17d4"
	I0906 16:52:43.917908    3396 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods/burstable/pod147592c7d7e95e3d9b61b36c4f0c377c/7d058d2c8fbef9b141826a7568560cc70f0be223e77ccb9e7356697e062a17d4/freezer.state
	I0906 16:52:43.924222    3396 api_server.go:204] freezer state: "THAWED"
	I0906 16:52:43.924240    3396 api_server.go:253] Checking apiserver healthz at https://192.168.64.11:8443/healthz ...
	I0906 16:52:43.928763    3396 api_server.go:279] https://192.168.64.11:8443/healthz returned 200:
	ok
	I0906 16:52:43.928776    3396 status.go:421] multinode-118000 apiserver status = Running (err=<nil>)
	I0906 16:52:43.928782    3396 status.go:257] multinode-118000 status: &{Name:multinode-118000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0906 16:52:43.928793    3396 status.go:255] checking status of multinode-118000-m02 ...
	I0906 16:52:43.929077    3396 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:52:43.929098    3396 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:52:43.936383    3396 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51492
	I0906 16:52:43.936740    3396 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:52:43.937076    3396 main.go:141] libmachine: Using API Version  1
	I0906 16:52:43.937085    3396 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:52:43.937333    3396 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:52:43.937465    3396 main.go:141] libmachine: (multinode-118000-m02) Calling .GetState
	I0906 16:52:43.937552    3396 main.go:141] libmachine: (multinode-118000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 16:52:43.937613    3396 main.go:141] libmachine: (multinode-118000-m02) DBG | hyperkit pid from json: 3125
	I0906 16:52:43.938748    3396 status.go:330] multinode-118000-m02 host status = "Running" (err=<nil>)
	I0906 16:52:43.938757    3396 host.go:66] Checking if "multinode-118000-m02" exists ...
	I0906 16:52:43.939016    3396 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:52:43.939037    3396 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:52:43.945943    3396 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51494
	I0906 16:52:43.946277    3396 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:52:43.946620    3396 main.go:141] libmachine: Using API Version  1
	I0906 16:52:43.946634    3396 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:52:43.946848    3396 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:52:43.946955    3396 main.go:141] libmachine: (multinode-118000-m02) Calling .GetIP
	I0906 16:52:43.947036    3396 host.go:66] Checking if "multinode-118000-m02" exists ...
	I0906 16:52:43.947301    3396 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:52:43.947327    3396 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:52:43.954226    3396 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51496
	I0906 16:52:43.954550    3396 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:52:43.954883    3396 main.go:141] libmachine: Using API Version  1
	I0906 16:52:43.954899    3396 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:52:43.955080    3396 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:52:43.955171    3396 main.go:141] libmachine: (multinode-118000-m02) Calling .DriverName
	I0906 16:52:43.955293    3396 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0906 16:52:43.955305    3396 main.go:141] libmachine: (multinode-118000-m02) Calling .GetSSHHostname
	I0906 16:52:43.955378    3396 main.go:141] libmachine: (multinode-118000-m02) Calling .GetSSHPort
	I0906 16:52:43.955453    3396 main.go:141] libmachine: (multinode-118000-m02) Calling .GetSSHKeyPath
	I0906 16:52:43.955530    3396 main.go:141] libmachine: (multinode-118000-m02) Calling .GetSSHUsername
	I0906 16:52:43.955594    3396 sshutil.go:53] new ssh client: &{IP:192.168.64.12 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/17174-977/.minikube/machines/multinode-118000-m02/id_rsa Username:docker}
	I0906 16:52:43.993776    3396 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 16:52:44.002245    3396 status.go:257] multinode-118000-m02 status: &{Name:multinode-118000-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0906 16:52:44.002261    3396 status.go:255] checking status of multinode-118000-m03 ...
	I0906 16:52:44.002516    3396 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:52:44.002539    3396 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:52:44.009470    3396 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51499
	I0906 16:52:44.009810    3396 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:52:44.010158    3396 main.go:141] libmachine: Using API Version  1
	I0906 16:52:44.010170    3396 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:52:44.010406    3396 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:52:44.010500    3396 main.go:141] libmachine: (multinode-118000-m03) Calling .GetState
	I0906 16:52:44.010578    3396 main.go:141] libmachine: (multinode-118000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 16:52:44.010642    3396 main.go:141] libmachine: (multinode-118000-m03) DBG | hyperkit pid from json: 3189
	I0906 16:52:44.011753    3396 main.go:141] libmachine: (multinode-118000-m03) DBG | hyperkit pid 3189 missing from process table
	I0906 16:52:44.011788    3396 status.go:330] multinode-118000-m03 host status = "Stopped" (err=<nil>)
	I0906 16:52:44.011799    3396 status.go:343] host is not running, skipping remaining checks
	I0906 16:52:44.011805    3396 status.go:257] multinode-118000-m03 status: &{Name:multinode-118000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.64s)

TestMultiNode/serial/StartAfterStop (27.22s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 node start m03 --alsologtostderr
E0906 16:52:56.053408    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
multinode_test.go:254: (dbg) Done: out/minikube-darwin-amd64 -p multinode-118000 node start m03 --alsologtostderr: (26.883564618s)
multinode_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status
multinode_test.go:275: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (27.22s)

TestMultiNode/serial/RestartKeepsNodes (198.31s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:283: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-118000
multinode_test.go:290: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-118000
multinode_test.go:290: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-118000: (18.392228156s)
multinode_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-118000 --wait=true -v=8 --alsologtostderr
E0906 16:53:37.013584    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:54:12.011736    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:54:22.937350    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 16:54:39.726791    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 16:54:58.935744    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
multinode_test.go:295: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-118000 --wait=true -v=8 --alsologtostderr: (2m59.831976521s)
multinode_test.go:300: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-118000
--- PASS: TestMultiNode/serial/RestartKeepsNodes (198.31s)

TestMultiNode/serial/DeleteNode (2.98s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:394: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 node delete m03
multinode_test.go:394: (dbg) Done: out/minikube-darwin-amd64 -p multinode-118000 node delete m03: (2.662887467s)
multinode_test.go:400: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status --alsologtostderr
multinode_test.go:424: (dbg) Run:  kubectl get nodes
multinode_test.go:432: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (2.98s)

TestMultiNode/serial/StopMultiNode (16.43s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 stop
multinode_test.go:314: (dbg) Done: out/minikube-darwin-amd64 -p multinode-118000 stop: (16.307928968s)
multinode_test.go:320: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status
multinode_test.go:320: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-118000 status: exit status 7 (61.838576ms)

                                                
                                                
-- stdout --
	multinode-118000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-118000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
multinode_test.go:327: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status --alsologtostderr
multinode_test.go:327: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-118000 status --alsologtostderr: exit status 7 (60.01037ms)

                                                
                                                
-- stdout --
	multinode-118000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-118000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0906 16:56:48.935025    3493 out.go:296] Setting OutFile to fd 1 ...
	I0906 16:56:48.935204    3493 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:56:48.935210    3493 out.go:309] Setting ErrFile to fd 2...
	I0906 16:56:48.935214    3493 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 16:56:48.935383    3493 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/17174-977/.minikube/bin
	I0906 16:56:48.935558    3493 out.go:303] Setting JSON to false
	I0906 16:56:48.935598    3493 mustload.go:65] Loading cluster: multinode-118000
	I0906 16:56:48.935647    3493 notify.go:220] Checking for updates...
	I0906 16:56:48.935911    3493 config.go:182] Loaded profile config "multinode-118000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.28.1
	I0906 16:56:48.935922    3493 status.go:255] checking status of multinode-118000 ...
	I0906 16:56:48.936272    3493 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:56:48.936318    3493 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:56:48.942989    3493 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51680
	I0906 16:56:48.943310    3493 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:56:48.943703    3493 main.go:141] libmachine: Using API Version  1
	I0906 16:56:48.943721    3493 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:56:48.943969    3493 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:56:48.944085    3493 main.go:141] libmachine: (multinode-118000) Calling .GetState
	I0906 16:56:48.944167    3493 main.go:141] libmachine: (multinode-118000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 16:56:48.944220    3493 main.go:141] libmachine: (multinode-118000) DBG | hyperkit pid from json: 3446
	I0906 16:56:48.945054    3493 main.go:141] libmachine: (multinode-118000) DBG | hyperkit pid 3446 missing from process table
	I0906 16:56:48.945076    3493 status.go:330] multinode-118000 host status = "Stopped" (err=<nil>)
	I0906 16:56:48.945083    3493 status.go:343] host is not running, skipping remaining checks
	I0906 16:56:48.945089    3493 status.go:257] multinode-118000 status: &{Name:multinode-118000 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0906 16:56:48.945112    3493 status.go:255] checking status of multinode-118000-m02 ...
	I0906 16:56:48.945351    3493 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 16:56:48.945364    3493 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0906 16:56:48.952094    3493 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:51682
	I0906 16:56:48.952399    3493 main.go:141] libmachine: () Calling .GetVersion
	I0906 16:56:48.952702    3493 main.go:141] libmachine: Using API Version  1
	I0906 16:56:48.952716    3493 main.go:141] libmachine: () Calling .SetConfigRaw
	I0906 16:56:48.952897    3493 main.go:141] libmachine: () Calling .GetMachineName
	I0906 16:56:48.952989    3493 main.go:141] libmachine: (multinode-118000-m02) Calling .GetState
	I0906 16:56:48.953072    3493 main.go:141] libmachine: (multinode-118000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 16:56:48.953135    3493 main.go:141] libmachine: (multinode-118000-m02) DBG | hyperkit pid from json: 3454
	I0906 16:56:48.953958    3493 main.go:141] libmachine: (multinode-118000-m02) DBG | hyperkit pid 3454 missing from process table
	I0906 16:56:48.953977    3493 status.go:330] multinode-118000-m02 host status = "Stopped" (err=<nil>)
	I0906 16:56:48.953984    3493 status.go:343] host is not running, skipping remaining checks
	I0906 16:56:48.953989    3493 status.go:257] multinode-118000-m02 status: &{Name:multinode-118000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (16.43s)

                                                
                                    
TestMultiNode/serial/RestartMultiNode (99.52s)

                                                
                                                
=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:354: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-118000 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E0906 16:57:15.088787    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 16:57:42.776519    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
multinode_test.go:354: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-118000 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (1m39.207054246s)
multinode_test.go:360: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-118000 status --alsologtostderr
multinode_test.go:374: (dbg) Run:  kubectl get nodes
multinode_test.go:382: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (99.52s)

                                                
                                    
TestMultiNode/serial/ValidateNameConflict (44.56s)

                                                
                                                
=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-118000
multinode_test.go:452: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-118000-m02 --driver=hyperkit 
multinode_test.go:452: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-118000-m02 --driver=hyperkit : exit status 14 (421.691998ms)

                                                
                                                
-- stdout --
	* [multinode-118000-m02] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! Profile name 'multinode-118000-m02' is duplicated with machine name 'multinode-118000-m02' in profile 'multinode-118000'
	X Exiting due to MK_USAGE: Profile name should be unique

                                                
                                                
** /stderr **
multinode_test.go:460: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-118000-m03 --driver=hyperkit 
multinode_test.go:460: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-118000-m03 --driver=hyperkit : (38.574051109s)
multinode_test.go:467: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-118000
multinode_test.go:467: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-118000: exit status 80 (253.467767ms)

                                                
                                                
-- stdout --
	* Adding node m03 to cluster multinode-118000
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-118000-m03 already exists in multinode-118000-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-118000-m03
E0906 16:59:12.012629    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
multinode_test.go:472: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-118000-m03: (5.272981313s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (44.56s)

                                                
                                    
TestPreload (142.72s)

                                                
                                                
=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-884000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4
preload_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-884000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4: (1m14.451130714s)
preload_test.go:52: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-884000 image pull gcr.io/k8s-minikube/busybox
preload_test.go:52: (dbg) Done: out/minikube-darwin-amd64 -p test-preload-884000 image pull gcr.io/k8s-minikube/busybox: (1.206847531s)
preload_test.go:58: (dbg) Run:  out/minikube-darwin-amd64 stop -p test-preload-884000
E0906 17:00:45.981319    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
preload_test.go:58: (dbg) Done: out/minikube-darwin-amd64 stop -p test-preload-884000: (8.229205839s)
preload_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-884000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit 
preload_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-884000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit : (53.395205342s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-884000 image list
helpers_test.go:175: Cleaning up "test-preload-884000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-884000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-884000: (5.267654517s)
--- PASS: TestPreload (142.72s)

                                                
                                    
TestScheduledStopUnix (105.7s)

                                                
                                                
=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-202000 --memory=2048 --driver=hyperkit 
E0906 17:02:15.059320    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
scheduled_stop_test.go:128: (dbg) Done: out/minikube-darwin-amd64 start -p scheduled-stop-202000 --memory=2048 --driver=hyperkit : (34.229240496s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-202000 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.TimeToStop}} -p scheduled-stop-202000 -n scheduled-stop-202000
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-202000 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-202000 --cancel-scheduled
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-202000 -n scheduled-stop-202000
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-202000
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-202000 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-202000
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p scheduled-stop-202000: exit status 7 (56.756165ms)

                                                
                                                
-- stdout --
	scheduled-stop-202000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

                                                
                                                
-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-202000 -n scheduled-stop-202000
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-202000 -n scheduled-stop-202000: exit status 7 (51.427733ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-202000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-202000
--- PASS: TestScheduledStopUnix (105.70s)

                                                
                                    
TestSkaffold (107.8s)

                                                
                                                
=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/skaffold.exe3196356901 version
skaffold_test.go:59: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/skaffold.exe3196356901 version: (1.052803774s)
skaffold_test.go:63: skaffold version: v2.7.0
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-233000 --memory=2600 --driver=hyperkit 
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-233000 --memory=2600 --driver=hyperkit : (34.363251527s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:105: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/skaffold.exe3196356901 run --minikube-profile skaffold-233000 --kube-context skaffold-233000 --status-check=true --port-forward=false --interactive=false
E0906 17:04:11.983584    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 17:04:22.908213    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
skaffold_test.go:105: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/skaffold.exe3196356901 run --minikube-profile skaffold-233000 --kube-context skaffold-233000 --status-check=true --port-forward=false --interactive=false: (56.166122036s)
skaffold_test.go:111: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:344: "leeroy-app-5bf977855b-hhvtj" [3c959190-0e2b-4ee3-9b05-6a8cc7e34c7b] Running
skaffold_test.go:111: (dbg) TestSkaffold: app=leeroy-app healthy within 5.013734357s
skaffold_test.go:114: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:344: "leeroy-web-7b77698d4f-t2mr6" [1256614c-da64-48b3-9b31-0e21e300806b] Running
skaffold_test.go:114: (dbg) TestSkaffold: app=leeroy-web healthy within 5.007982581s
helpers_test.go:175: Cleaning up "skaffold-233000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-233000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-233000: (5.260719023s)
--- PASS: TestSkaffold (107.80s)

                                                
                                    
TestRunningBinaryUpgrade (169.56s)

                                                
                                                
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:132: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.1869581476.exe start -p running-upgrade-454000 --memory=2200 --vm-driver=hyperkit 
E0906 17:08:38.106012    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
version_upgrade_test.go:132: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.1869581476.exe start -p running-upgrade-454000 --memory=2200 --vm-driver=hyperkit : (1m29.302696388s)
version_upgrade_test.go:142: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-454000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0906 17:09:11.979591    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 17:09:22.903746    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 17:10:04.470099    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:04.475981    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:04.486894    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:04.508339    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:04.548749    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:04.630102    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:04.792325    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:05.113399    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:05.753838    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:07.034072    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:09.594831    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:10:14.715904    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
version_upgrade_test.go:142: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-454000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (1m14.564197221s)
helpers_test.go:175: Cleaning up "running-upgrade-454000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-454000
E0906 17:10:24.957999    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-454000: (5.263083399s)
--- PASS: TestRunningBinaryUpgrade (169.56s)

                                                
                                    
TestKubernetesUpgrade (164.1s)

                                                
                                                
=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:234: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-339000 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:234: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-339000 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit : (1m15.80873791s)
version_upgrade_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-339000
version_upgrade_test.go:239: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-339000: (8.258456809s)
version_upgrade_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-339000 status --format={{.Host}}
version_upgrade_test.go:244: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-339000 status --format={{.Host}}: exit status 7 (51.352135ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
version_upgrade_test.go:246: status error: exit status 7 (may be ok)
version_upgrade_test.go:255: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-339000 --memory=2200 --kubernetes-version=v1.28.1 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:255: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-339000 --memory=2200 --kubernetes-version=v1.28.1 --alsologtostderr -v=1 --driver=hyperkit : (33.433149324s)
version_upgrade_test.go:260: (dbg) Run:  kubectl --context kubernetes-upgrade-339000 version --output=json
version_upgrade_test.go:279: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-339000 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit 
version_upgrade_test.go:281: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-339000 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit : exit status 106 (418.067243ms)

                                                
                                                
-- stdout --
	* [kubernetes-upgrade-339000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.28.1 cluster to v1.16.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.16.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-339000
	    minikube start -p kubernetes-upgrade-339000 --kubernetes-version=v1.16.0
	    
	    2) Create a second cluster with Kubernetes 1.16.0, by running:
	    
	    minikube start -p kubernetes-upgrade-3390002 --kubernetes-version=v1.16.0
	    
	    3) Use the existing cluster at version Kubernetes 1.28.1, by running:
	    
	    minikube start -p kubernetes-upgrade-339000 --kubernetes-version=v1.28.1
	    

                                                
                                                
** /stderr **
version_upgrade_test.go:285: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:287: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-339000 --memory=2200 --kubernetes-version=v1.28.1 --alsologtostderr -v=1 --driver=hyperkit 
E0906 17:12:48.318194    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
version_upgrade_test.go:287: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-339000 --memory=2200 --kubernetes-version=v1.28.1 --alsologtostderr -v=1 --driver=hyperkit : (40.824349724s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-339000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-339000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-339000: (5.254741387s)
--- PASS: TestKubernetesUpgrade (164.10s)

                                                
                                    
TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.58s)

                                                
                                                
=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.31.2 on darwin
- MINIKUBE_LOCATION=17174
- KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current418168361/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

                                                
                                                
$ sudo chown root:wheel /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current418168361/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current418168361/001/.minikube/bin/docker-machine-driver-hyperkit 

                                                
                                                

                                                
                                                
! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current418168361/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.58s)

                                                
                                    
TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.67s)

                                                
                                                
=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.31.2 on darwin
- MINIKUBE_LOCATION=17174
- KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4264681259/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
E0906 17:05:35.057427    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

                                                
                                                
$ sudo chown root:wheel /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4264681259/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4264681259/001/.minikube/bin/docker-machine-driver-hyperkit 

                                                
                                                

                                                
                                                
! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4264681259/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.67s)

                                                
                                    
TestStoppedBinaryUpgrade/Setup (0.49s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/Setup
E0906 17:10:45.438025    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
--- PASS: TestStoppedBinaryUpgrade/Setup (0.49s)

                                                
                                    
TestStoppedBinaryUpgrade/Upgrade (154.18s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:195: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.223706652.exe start -p stopped-upgrade-890000 --memory=2200 --vm-driver=hyperkit 
E0906 17:11:26.397824    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
version_upgrade_test.go:195: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.223706652.exe start -p stopped-upgrade-890000 --memory=2200 --vm-driver=hyperkit : (1m26.07711791s)
version_upgrade_test.go:204: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.223706652.exe -p stopped-upgrade-890000 stop
E0906 17:12:15.051781    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
version_upgrade_test.go:204: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.223706652.exe -p stopped-upgrade-890000 stop: (8.074260605s)
version_upgrade_test.go:210: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-890000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:210: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-890000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (1m0.031849397s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (154.18s)

                                                
                                    
TestStoppedBinaryUpgrade/MinikubeLogs (2.7s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-890000
version_upgrade_test.go:218: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-890000: (2.698649269s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.70s)

                                                
                                    
TestNoKubernetes/serial/StartNoK8sWithVersion (0.41s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-349000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-349000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (407.099521ms)

                                                
                                                
-- stdout --
	* [NoKubernetes-349000] minikube v1.31.2 on Darwin 13.5.1
	  - MINIKUBE_LOCATION=17174
	  - KUBECONFIG=/Users/jenkins/minikube-integration/17174-977/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/17174-977/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.41s)

                                                
                                    
TestNoKubernetes/serial/StartWithK8s (37.8s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-349000 --driver=hyperkit 
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-349000 --driver=hyperkit : (37.646090808s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-349000 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (37.80s)

                                                
                                    
TestNetworkPlugins/group/auto/Start (59.25s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p auto-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p auto-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=hyperkit : (59.246435072s)
--- PASS: TestNetworkPlugins/group/auto/Start (59.25s)

                                                
                                    
TestNoKubernetes/serial/StartWithStopK8s (7.69s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-349000 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-349000 --no-kubernetes --driver=hyperkit : (5.055583463s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-349000 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-349000 status -o json: exit status 2 (137.56267ms)

                                                
                                                
-- stdout --
	{"Name":"NoKubernetes-349000","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

                                                
                                                
-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-349000
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-349000: (2.497607732s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (7.69s)

                                                
                                    
TestNoKubernetes/serial/Start (16.21s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-349000 --no-kubernetes --driver=hyperkit 
E0906 17:14:11.976423    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 17:14:22.901404    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-349000 --no-kubernetes --driver=hyperkit : (16.214649248s)
--- PASS: TestNoKubernetes/serial/Start (16.21s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunning (0.12s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-349000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-349000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (123.592771ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.12s)

                                                
                                    
TestNoKubernetes/serial/ProfileList (0.51s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.51s)

                                                
                                    
TestNoKubernetes/serial/Stop (2.25s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-349000
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-349000: (2.253911475s)
--- PASS: TestNoKubernetes/serial/Stop (2.25s)

                                                
                                    
TestNetworkPlugins/group/auto/KubeletFlags (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p auto-758000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.14s)

                                                
                                    
TestNetworkPlugins/group/auto/NetCatPod (10.23s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context auto-758000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-nr5wp" [d1642db9-18e5-4c90-9b05-35b95b50915f] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-nr5wp" [d1642db9-18e5-4c90-9b05-35b95b50915f] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 10.006145408s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (10.23s)

                                                
                                    
TestNoKubernetes/serial/StartNoArgs (15.49s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-349000 --driver=hyperkit 
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-349000 --driver=hyperkit : (15.492204075s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (15.49s)

                                                
                                    
TestNetworkPlugins/group/auto/DNS (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:175: (dbg) Run:  kubectl --context auto-758000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.14s)

                                                
                                    
TestNetworkPlugins/group/auto/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:194: (dbg) Run:  kubectl --context auto-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/auto/HairPin (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:264: (dbg) Run:  kubectl --context auto-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/auto/HairPin (0.11s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.12s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-349000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-349000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (123.477221ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.12s)

                                                
                                    
TestNetworkPlugins/group/kindnet/Start (58.38s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p kindnet-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p kindnet-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=hyperkit : (58.383208777s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (58.38s)

                                                
                                    
TestNetworkPlugins/group/calico/Start (82.83s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p calico-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=hyperkit 
E0906 17:15:04.465980    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:15:32.157225    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p calico-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=hyperkit : (1m22.833092034s)
--- PASS: TestNetworkPlugins/group/calico/Start (82.83s)

                                                
                                    
TestNetworkPlugins/group/kindnet/ControllerPod (5.01s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:344: "kindnet-hldk6" [cc1b3989-7d8f-42d7-9067-735274c634c0] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 5.013035911s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (5.01s)

                                                
                                    
TestNetworkPlugins/group/kindnet/KubeletFlags (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kindnet-758000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.14s)

                                                
                                    
TestNetworkPlugins/group/kindnet/NetCatPod (10.21s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kindnet-758000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-s476x" [7564495d-10f1-4843-a3b3-e821b2f4d8f4] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-s476x" [7564495d-10f1-4843-a3b3-e821b2f4d8f4] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 10.008416422s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (10.21s)

                                                
                                    
TestNetworkPlugins/group/kindnet/DNS (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kindnet-758000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.14s)

                                                
                                    
TestNetworkPlugins/group/kindnet/Localhost (0.13s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kindnet-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.13s)

                                                
                                    
TestNetworkPlugins/group/kindnet/HairPin (0.13s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kindnet-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.13s)

                                                
                                    
TestNetworkPlugins/group/calico/ControllerPod (5.02s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:344: "calico-node-spcd7" [e1d08a5f-9bc9-4ecd-b58e-b64b7d228174] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 5.017156443s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (5.02s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/Start (60.68s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p custom-flannel-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p custom-flannel-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=hyperkit : (1m0.677146654s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (60.68s)

                                                
                                    
TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p calico-758000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/calico/NetCatPod (10.22s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context calico-758000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-dd44g" [ec44bb86-1b54-4bf2-9e1b-dfd2ffac6631] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-dd44g" [ec44bb86-1b54-4bf2-9e1b-dfd2ffac6631] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 10.007752886s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (10.22s)

                                                
                                    
TestNetworkPlugins/group/calico/DNS (0.18s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:175: (dbg) Run:  kubectl --context calico-758000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.18s)

                                                
                                    
TestNetworkPlugins/group/calico/Localhost (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:194: (dbg) Run:  kubectl --context calico-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.15s)

                                                
                                    
TestNetworkPlugins/group/calico/HairPin (0.14s)
=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:264: (dbg) Run:  kubectl --context calico-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.14s)

                                                
                                    
TestNetworkPlugins/group/false/Start (89.78s)
=== RUN   TestNetworkPlugins/group/false/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p false-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=false --driver=hyperkit 
E0906 17:17:15.049036    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p false-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=false --driver=hyperkit : (1m29.78387162s)
--- PASS: TestNetworkPlugins/group/false/Start (89.78s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.16s)
=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p custom-flannel-758000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.16s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/NetCatPod (10.27s)
=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context custom-flannel-758000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-qdmv4" [c2171395-651c-4667-988a-0dae694cf37e] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-qdmv4" [c2171395-651c-4667-988a-0dae694cf37e] Running
E0906 17:17:25.945803    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 10.009534851s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (10.27s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/DNS (0.13s)
=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context custom-flannel-758000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.13s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context custom-flannel-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context custom-flannel-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.10s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/Start (49.1s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p enable-default-cni-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p enable-default-cni-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=hyperkit : (49.096457189s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (49.10s)

                                                
                                    
TestNetworkPlugins/group/false/KubeletFlags (0.14s)
=== RUN   TestNetworkPlugins/group/false/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p false-758000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/false/KubeletFlags (0.14s)

                                                
                                    
TestNetworkPlugins/group/false/NetCatPod (11.18s)
=== RUN   TestNetworkPlugins/group/false/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context false-758000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/false/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-cpwbn" [1ff07682-553f-4cbb-b77e-292e55d9a29c] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-cpwbn" [1ff07682-553f-4cbb-b77e-292e55d9a29c] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/false/NetCatPod: app=netcat healthy within 11.00773419s
--- PASS: TestNetworkPlugins/group/false/NetCatPod (11.18s)

                                                
                                    
TestNetworkPlugins/group/false/DNS (0.17s)
=== RUN   TestNetworkPlugins/group/false/DNS
net_test.go:175: (dbg) Run:  kubectl --context false-758000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/false/DNS (0.17s)

                                                
                                    
TestNetworkPlugins/group/false/Localhost (0.14s)
=== RUN   TestNetworkPlugins/group/false/Localhost
net_test.go:194: (dbg) Run:  kubectl --context false-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/false/Localhost (0.14s)

                                                
                                    
TestNetworkPlugins/group/false/HairPin (0.12s)
=== RUN   TestNetworkPlugins/group/false/HairPin
net_test.go:264: (dbg) Run:  kubectl --context false-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/false/HairPin (0.12s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.14s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p enable-default-cni-758000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.14s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/NetCatPod (11.18s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context enable-default-cni-758000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-bcl2h" [60bae6ad-ae32-496b-9bfa-748e4d9296e5] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-bcl2h" [60bae6ad-ae32-496b-9bfa-748e4d9296e5] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 11.007766929s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (11.18s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/DNS (0.13s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:175: (dbg) Run:  kubectl --context enable-default-cni-758000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.13s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:194: (dbg) Run:  kubectl --context enable-default-cni-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/enable-default-cni/HairPin (0.11s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:264: (dbg) Run:  kubectl --context enable-default-cni-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.11s)

                                                
                                    
TestNetworkPlugins/group/bridge/Start (49.49s)
=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p bridge-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p bridge-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=hyperkit : (49.489194488s)
--- PASS: TestNetworkPlugins/group/bridge/Start (49.49s)

                                                
                                    
TestNetworkPlugins/group/kubenet/Start (49.15s)
=== RUN   TestNetworkPlugins/group/kubenet/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p kubenet-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --network-plugin=kubenet --driver=hyperkit 
E0906 17:19:22.897987    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 17:19:27.997492    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:28.003107    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:28.013413    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:28.033813    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:28.074995    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:28.156591    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:28.316809    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:28.636933    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:29.278502    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:30.558714    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:33.118804    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:38.238856    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:19:48.479991    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p kubenet-758000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --network-plugin=kubenet --driver=hyperkit : (49.152571978s)
--- PASS: TestNetworkPlugins/group/kubenet/Start (49.15s)

                                                
                                    
TestNetworkPlugins/group/bridge/KubeletFlags (0.14s)
=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p bridge-758000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.14s)

                                                
                                    
TestNetworkPlugins/group/bridge/NetCatPod (10.18s)
=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context bridge-758000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-mr2sd" [5a820138-20c1-46bc-ab58-2d68daaedd7d] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-mr2sd" [5a820138-20c1-46bc-ab58-2d68daaedd7d] Running
E0906 17:20:04.462523    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 10.007962386s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (10.18s)

                                                
                                    
TestNetworkPlugins/group/bridge/DNS (0.13s)
=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:175: (dbg) Run:  kubectl --context bridge-758000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.13s)

                                                
                                    
TestNetworkPlugins/group/bridge/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:194: (dbg) Run:  kubectl --context bridge-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/bridge/HairPin (0.11s)
=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:264: (dbg) Run:  kubectl --context bridge-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.11s)

                                                
                                    
TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)
=== RUN   TestNetworkPlugins/group/kubenet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kubenet-758000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/kubenet/NetCatPod (9.24s)
=== RUN   TestNetworkPlugins/group/kubenet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kubenet-758000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-ssdg8" [a699c4d2-229a-43aa-b83f-aa7de3e5f950] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-ssdg8" [a699c4d2-229a-43aa-b83f-aa7de3e5f950] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: app=netcat healthy within 9.011983528s
--- PASS: TestNetworkPlugins/group/kubenet/NetCatPod (9.24s)

                                                
                                    
TestNetworkPlugins/group/kubenet/DNS (0.12s)
=== RUN   TestNetworkPlugins/group/kubenet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kubenet-758000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kubenet/DNS (0.12s)

                                                
                                    
TestNetworkPlugins/group/kubenet/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/kubenet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kubenet-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kubenet/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/kubenet/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/kubenet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kubenet-758000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kubenet/HairPin (0.10s)
E0906 17:33:36.710053    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:33:48.125860    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:33:55.492274    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:34:05.964832    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 17:34:11.991285    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 17:34:15.818273    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:34:22.915849    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 17:34:28.017819    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:34:54.596363    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:35:04.482194    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:35:10.601669    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:35:17.412549    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:35:45.629126    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:35:51.063416    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:36:18.312249    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:37:08.679355    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:37:15.063042    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 17:37:19.418429    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory

                                                
                                    
TestStartStop/group/old-k8s-version/serial/FirstStart (131.54s)
=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-077000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-077000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (2m11.535969416s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (131.54s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/DeployApp (8.3s)
=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-077000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [5547aa6c-200d-458b-96b5-6b9518e43de3] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [5547aa6c-200d-458b-96b5-6b9518e43de3] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 8.016436527s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-077000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (8.30s)

                                                
                                    
TestStartStop/group/no-preload/serial/Stop (8.27s)
=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p no-preload-343000 --alsologtostderr -v=3
E0906 17:22:39.884208    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:22:40.225542    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p no-preload-343000 --alsologtostderr -v=3: (8.265522995s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (8.27s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.76s)
=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p old-k8s-version-077000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context old-k8s-version-077000 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.76s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/Stop (8.22s)
=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p old-k8s-version-077000 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p old-k8s-version-077000 --alsologtostderr -v=3: (8.216458574s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (8.22s)

                                                
                                    
TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.29s)
=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000: exit status 7 (50.992215ms)

                                                
                                                
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p no-preload-343000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.29s)

                                                
                                    
TestStartStop/group/no-preload/serial/SecondStart (61.15s)
=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-343000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.28.1
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-343000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.28.1: (1m0.992205896s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-343000 -n no-preload-343000
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (61.15s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.28s)
=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-077000 -n old-k8s-version-077000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-077000 -n old-k8s-version-077000: exit status 7 (50.716814ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p old-k8s-version-077000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.28s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/SecondStart (510.68s)
=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-077000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0906 17:23:00.366336    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:23:21.982148    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:21.988315    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:21.999895    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:22.021176    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:22.061812    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:22.192778    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:22.353046    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:22.673319    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:23.314401    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:24.594780    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:27.154971    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:29.459048    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:23:32.275758    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:36.686845    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:36.692129    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:36.702767    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:36.722974    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:36.763071    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:36.843258    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:37.005140    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:37.327277    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:37.968588    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:39.249072    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:41.326759    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:23:41.810918    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:23:42.517302    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:23:46.932593    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-077000 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (8m30.528796974s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-077000 -n old-k8s-version-077000
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (510.68s)

                                                
                                    
TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (15.01s)
=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-7x82w" [a408911b-d092-4dd0-9799-d797437d3417] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E0906 17:23:57.172742    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-7x82w" [a408911b-d092-4dd0-9799-d797437d3417] Running
E0906 17:24:02.145486    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:24:02.998692    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 15.013375686s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (15.01s)

                                                
                                    
TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.08s)
=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-7x82w" [a408911b-d092-4dd0-9799-d797437d3417] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.009392645s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context no-preload-343000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.08s)

                                                
                                    
TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.16s)
=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p no-preload-343000 "sudo crictl images -o json"
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.16s)

                                                
                                    
TestStartStop/group/no-preload/serial/Pause (1.79s)
=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p no-preload-343000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-343000 -n no-preload-343000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-343000 -n no-preload-343000: exit status 2 (145.310013ms)

                                                
                                                
-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-343000 -n no-preload-343000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-343000 -n no-preload-343000: exit status 2 (144.85204ms)

                                                
                                                
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p no-preload-343000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-343000 -n no-preload-343000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-343000 -n no-preload-343000
--- PASS: TestStartStop/group/no-preload/serial/Pause (1.79s)

                                                
                                    
TestStartStop/group/embed-certs/serial/FirstStart (49.12s)
=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-459000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.1
E0906 17:24:17.653543    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:24:22.893461    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 17:24:27.993805    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:24:43.959994    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:24:54.573010    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:54.578332    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:54.588652    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:54.608999    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:54.649546    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:54.731623    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:54.891761    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:55.212394    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:55.680223    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:24:55.852855    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:57.133324    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:24:58.614100    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:24:59.693396    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:25:03.246133    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:25:04.459515    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:25:04.814765    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-459000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.1: (49.124371327s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (49.12s)

                                                
                                    
TestStartStop/group/embed-certs/serial/DeployApp (8.27s)
=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-459000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [d2565bcd-1e58-4d30-89de-f68dc32f58ad] Pending
helpers_test.go:344: "busybox" [d2565bcd-1e58-4d30-89de-f68dc32f58ad] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [d2565bcd-1e58-4d30-89de-f68dc32f58ad] Running
E0906 17:25:10.579682    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:10.585805    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:10.596657    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:10.617314    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:10.658547    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:10.738787    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:10.899635    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:11.221758    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:11.863044    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:13.144830    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 8.017637541s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-459000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (8.27s)

                                                
                                    
TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.85s)
=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p embed-certs-459000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context embed-certs-459000 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.85s)

                                                
                                    
TestStartStop/group/embed-certs/serial/Stop (8.25s)
=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p embed-certs-459000 --alsologtostderr -v=3
E0906 17:25:15.054806    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:25:15.706407    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:18.094345    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 17:25:20.827385    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p embed-certs-459000 --alsologtostderr -v=3: (8.250918222s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (8.25s)

                                                
                                    
TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.29s)
=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-459000 -n embed-certs-459000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-459000 -n embed-certs-459000: exit status 7 (51.453715ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p embed-certs-459000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.29s)

                                                
                                    
TestStartStop/group/embed-certs/serial/SecondStart (296.92s)
=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-459000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.1
E0906 17:25:31.067644    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:25:35.536077    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:25:45.606589    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:25:51.547648    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:26:05.890830    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:26:13.317465    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:26:16.519759    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:26:18.314667    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:26:20.559093    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:26:27.537968    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:26:32.537100    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:26:46.015514    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
E0906 17:27:15.074847    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
E0906 17:27:19.429240    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:27:38.450753    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:27:47.117608    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
E0906 17:27:54.459611    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:28:22.010164    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:28:36.716317    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:28:48.131902    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:48.137136    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:48.147666    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:48.167999    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:48.208664    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:48.289040    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:48.449557    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:48.771693    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:49.413534    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:49.749923    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/false-758000/client.crt: no such file or directory
E0906 17:28:50.694315    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:53.256466    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:28:58.378617    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:29:04.406822    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/enable-default-cni-758000/client.crt: no such file or directory
E0906 17:29:08.618969    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:29:11.996334    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/functional-283000/client.crt: no such file or directory
E0906 17:29:22.921934    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/addons-720000/client.crt: no such file or directory
E0906 17:29:28.022109    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/auto-758000/client.crt: no such file or directory
E0906 17:29:29.100218    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:29:54.601026    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
E0906 17:30:04.486675    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/skaffold-233000/client.crt: no such file or directory
E0906 17:30:10.059869    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
E0906 17:30:10.606774    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-459000 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.28.1: (4m56.763054398s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-459000 -n embed-certs-459000
--- PASS: TestStartStop/group/embed-certs/serial/SecondStart (296.92s)

                                                
                                    
TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (5.02s)

                                                
                                                
=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-jgjsx" [768e8369-1bd3-4a7f-b84b-0c2996106fff] Running
E0906 17:30:22.289573    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/bridge-758000/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.014514758s
--- PASS: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (5.02s)
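Note: the wait above is a label-selector poll — the helper watches the kubernetes-dashboard namespace for a pod labelled k8s-app=kubernetes-dashboard and declares it healthy once it is Running, which here takes about 5s after the second start. A rough client-go equivalent, for illustration only (it is not the helpers_test.go implementation and assumes the kubeconfig's current context points at the cluster under test):

package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client from the shared kubeconfig used in these runs.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/Users/jenkins/minikube-integration/17174-977/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	deadline := time.Now().Add(9 * time.Minute) // same budget as the test
	for time.Now().Before(deadline) {
		pods, err := cs.CoreV1().Pods("kubernetes-dashboard").List(context.TODO(),
			metav1.ListOptions{LabelSelector: "k8s-app=kubernetes-dashboard"})
		if err == nil {
			for _, p := range pods.Items {
				if p.Status.Phase == corev1.PodRunning {
					fmt.Println("healthy:", p.Name)
					return
				}
			}
		}
		time.Sleep(2 * time.Second)
	}
	panic("timed out waiting for kubernetes-dashboard pod")
}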

                                                
                                    
TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.06s)

                                                
                                                
=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-jgjsx" [768e8369-1bd3-4a7f-b84b-0c2996106fff] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.008539363s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context embed-certs-459000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.06s)

                                                
                                    
TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.16s)

                                                
                                                
=== RUN   TestStartStop/group/embed-certs/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p embed-certs-459000 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.16s)
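Note: the image audit ssh-es into the node, dumps crictl images -o json, and reports anything outside the expected Kubernetes/minikube image set; only the busybox test image shows up here. A hedged sketch of the parsing half (the JSON field names follow crictl's output as I understand it; verify against your crictl version):

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// imageList matches the assumed shape of `crictl images -o json`.
type imageList struct {
	Images []struct {
		RepoTags []string `json:"repoTags"`
	} `json:"images"`
}

func main() {
	// Feed it the JSON captured over ssh, e.g. via
	//   out/minikube-darwin-amd64 ssh -p embed-certs-459000 "sudo crictl images -o json" > images.json
	data, err := os.ReadFile("images.json")
	if err != nil {
		panic(err)
	}
	var list imageList
	if err := json.Unmarshal(data, &list); err != nil {
		panic(err)
	}
	for _, img := range list.Images {
		for _, tag := range img.RepoTags {
			fmt.Println(tag) // compare against the expected image set
		}
	}
}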

                                                
                                    
TestStartStop/group/embed-certs/serial/Pause (1.8s)

                                                
                                                
=== RUN   TestStartStop/group/embed-certs/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p embed-certs-459000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-459000 -n embed-certs-459000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-459000 -n embed-certs-459000: exit status 2 (144.72849ms)

                                                
                                                
-- stdout --
	Paused

                                                
                                                
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-459000 -n embed-certs-459000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-459000 -n embed-certs-459000: exit status 2 (145.640682ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p embed-certs-459000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-459000 -n embed-certs-459000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-459000 -n embed-certs-459000
--- PASS: TestStartStop/group/embed-certs/serial/Pause (1.80s)
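Note: the pause check reads individual fields out of minikube status via --format={{.APIServer}} and --format={{.Kubelet}}; right after pause the API server reports Paused and the kubelet Stopped (each with a tolerated non-zero exit), and unpause restores both. The --format value is a Go text/template applied to the status object; the sketch below illustrates that templating with a stand-in struct (not minikube's actual type):

package main

import (
	"os"
	"text/template"
)

// clusterStatus is a stand-in for the fields the test queries;
// minikube's real status struct may differ.
type clusterStatus struct {
	Host, Kubelet, APIServer, Kubeconfig string
}

func main() {
	st := clusterStatus{
		Host:       "Running",
		Kubelet:    "Stopped", // what the log shows right after `pause`
		APIServer:  "Paused",
		Kubeconfig: "Configured",
	}
	tmpl := template.Must(template.New("status").Parse("{{.APIServer}}\n"))
	_ = tmpl.Execute(os.Stdout, st) // prints: Paused
}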

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/FirstStart (88.18s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-diff-port-258000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.1
E0906 17:30:38.296936    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kubenet-758000/client.crt: no such file or directory
E0906 17:30:45.635328    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/kindnet-758000/client.crt: no such file or directory
E0906 17:31:18.319229    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/calico-758000/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-diff-port-258000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.1: (1m28.17546504s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/FirstStart (88.18s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.02s)

                                                
                                                
=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-84b68f675b-5c772" [e7e61b05-3dfa-4d78-9d11-6d334b0c4936] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.015301831s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.02s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

                                                
                                                
=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-84b68f675b-5c772" [e7e61b05-3dfa-4d78-9d11-6d334b0c4936] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.008481085s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context old-k8s-version-077000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.16s)

                                                
                                                
=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p old-k8s-version-077000 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.16s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/Pause (1.65s)

                                                
                                                
=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p old-k8s-version-077000 --alsologtostderr -v=1
E0906 17:31:31.980705    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/no-preload-343000/client.crt: no such file or directory
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-077000 -n old-k8s-version-077000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-077000 -n old-k8s-version-077000: exit status 2 (136.067166ms)

                                                
                                                
-- stdout --
	Paused

                                                
                                                
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-077000 -n old-k8s-version-077000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-077000 -n old-k8s-version-077000: exit status 2 (141.165843ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p old-k8s-version-077000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-077000 -n old-k8s-version-077000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-077000 -n old-k8s-version-077000
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (1.65s)

                                                
                                    
TestStartStop/group/newest-cni/serial/FirstStart (48.26s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-369000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.28.1
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-369000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.28.1: (48.256084732s)
--- PASS: TestStartStop/group/newest-cni/serial/FirstStart (48.26s)
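Note: this start exercises CNI-specific flags, including --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16, whose value follows minikube's component.key=value convention (the part before the first dot names the component the option is forwarded to). A small illustrative parser for that shape (not minikube's implementation):

package main

import (
	"fmt"
	"strings"
)

// splitExtraConfig breaks "component.key=value" into its parts.
func splitExtraConfig(v string) (component, key, value string, ok bool) {
	head, value, found := strings.Cut(v, "=")
	if !found {
		return "", "", "", false
	}
	component, key, found = strings.Cut(head, ".")
	if !found {
		return "", "", "", false
	}
	return component, key, value, true
}

func main() {
	c, k, v, ok := splitExtraConfig("kubeadm.pod-network-cidr=10.42.0.0/16")
	fmt.Println(c, k, v, ok) // kubeadm pod-network-cidr 10.42.0.0/16 true
}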

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/DeployApp (8.27s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-258000 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [ccd0e8c8-1454-4434-977b-09b923e36c9b] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [ccd0e8c8-1454-4434-977b-09b923e36c9b] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: integration-test=busybox healthy within 8.021940283s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-258000 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/DeployApp (8.27s)
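Note: the two helpers_test.go:344 lines show the busybox pod first Pending with its Ready/ContainersReady conditions false (the busybox container not yet up) and then Running; the helper is essentially printing the pod phase plus any condition that is not True while it waits. A sketch of producing that kind of summary from a pod object (illustrative only, not the helper's own code):

package main

import (
	"fmt"
	"strings"

	corev1 "k8s.io/api/core/v1"
)

// summarize renders a pod roughly the way the test helper logs it:
// the phase, plus any conditions that are not True.
func summarize(p *corev1.Pod) string {
	parts := []string{string(p.Status.Phase)}
	for _, c := range p.Status.Conditions {
		if c.Status != corev1.ConditionTrue {
			parts = append(parts, fmt.Sprintf("%s:%s (%s)", c.Type, c.Reason, c.Message))
		}
	}
	return strings.Join(parts, " / ")
}

func main() {
	pod := &corev1.Pod{
		Status: corev1.PodStatus{
			Phase: corev1.PodPending,
			Conditions: []corev1.PodCondition{{
				Type:    corev1.PodReady,
				Status:  corev1.ConditionFalse,
				Reason:  "ContainersNotReady",
				Message: "containers with unready status: [busybox]",
			}},
		},
	}
	fmt.Println(summarize(pod)) // Pending / Ready:ContainersNotReady (...)
}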

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (0.83s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-diff-port-258000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context default-k8s-diff-port-258000 describe deploy/metrics-server -n kube-system
E0906 17:32:15.068374    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/ingress-addon-legacy-656000/client.crt: no such file or directory
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (0.83s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/Stop (8.25s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p default-k8s-diff-port-258000 --alsologtostderr -v=3
E0906 17:32:19.422858    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/custom-flannel-758000/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p default-k8s-diff-port-258000 --alsologtostderr -v=3: (8.248910024s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Stop (8.25s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.31s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000: exit status 7 (51.469966ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p default-k8s-diff-port-258000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.31s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/SecondStart (296.64s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-diff-port-258000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.1
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-diff-port-258000 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.28.1: (4m56.492899002s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/SecondStart (296.64s)

                                                
                                    
TestStartStop/group/newest-cni/serial/DeployApp (0s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

                                                
                                    
TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.91s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-369000 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:211: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.91s)

                                                
                                    
TestStartStop/group/newest-cni/serial/Stop (8.25s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p newest-cni-369000 --alsologtostderr -v=3
E0906 17:32:33.566384    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:33.572754    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:33.582896    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:33.602974    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:33.643860    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:33.725483    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:33.886046    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:34.207555    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:34.848285    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:36.128971    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p newest-cni-369000 --alsologtostderr -v=3: (8.24588222s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (8.25s)

                                                
                                    
TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.29s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-369000 -n newest-cni-369000
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-369000 -n newest-cni-369000: exit status 7 (50.372332ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p newest-cni-369000 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.29s)

                                                
                                    
TestStartStop/group/newest-cni/serial/SecondStart (39.4s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-369000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.28.1
E0906 17:32:38.689191    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:43.810327    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:32:54.051577    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
E0906 17:33:14.531356    1438 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/17174-977/.minikube/profiles/old-k8s-version-077000/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-369000 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=hyperkit  --kubernetes-version=v1.28.1: (39.241536043s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-369000 -n newest-cni-369000
--- PASS: TestStartStop/group/newest-cni/serial/SecondStart (39.40s)

                                                
                                    
TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop
start_stop_delete_test.go:273: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0.00s)

                                                
                                    
TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/AddonExistsAfterStop
start_stop_delete_test.go:284: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0.00s)

                                                
                                    
TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.17s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p newest-cni-369000 "sudo crictl images -o json"
--- PASS: TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.17s)

                                                
                                    
TestStartStop/group/newest-cni/serial/Pause (1.79s)

                                                
                                                
=== RUN   TestStartStop/group/newest-cni/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p newest-cni-369000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-369000 -n newest-cni-369000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-369000 -n newest-cni-369000: exit status 2 (155.309368ms)

                                                
                                                
-- stdout --
	Paused

                                                
                                                
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-369000 -n newest-cni-369000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-369000 -n newest-cni-369000: exit status 2 (154.06049ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p newest-cni-369000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-369000 -n newest-cni-369000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-369000 -n newest-cni-369000
--- PASS: TestStartStop/group/newest-cni/serial/Pause (1.79s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (5.01s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-5lczw" [56553118-b8f2-4de3-b637-34442b5ec40b] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.012802429s
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (5.01s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (5.06s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-5lczw" [56553118-b8f2-4de3-b637-34442b5ec40b] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.008656725s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context default-k8s-diff-port-258000 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (5.06s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages (0.16s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p default-k8s-diff-port-258000 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages (0.16s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/Pause (1.8s)

                                                
                                                
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p default-k8s-diff-port-258000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000: exit status 2 (138.666219ms)

                                                
                                                
-- stdout --
	Paused

                                                
                                                
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000: exit status 2 (139.568368ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p default-k8s-diff-port-258000 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-258000 -n default-k8s-diff-port-258000
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Pause (1.80s)

                                                
                                    

Test skip (19/300)

TestDownloadOnly/v1.16.0/cached-images (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.16.0/cached-images
aaa_download_only_test.go:117: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.16.0/cached-images (0.00s)

                                                
                                    
TestDownloadOnly/v1.16.0/binaries (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.16.0/binaries
aaa_download_only_test.go:136: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.16.0/binaries (0.00s)

                                                
                                    
TestDownloadOnly/v1.28.1/cached-images (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.1/cached-images
aaa_download_only_test.go:117: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.1/cached-images (0.00s)

                                                
                                    
TestDownloadOnly/v1.28.1/binaries (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.1/binaries
aaa_download_only_test.go:136: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.1/binaries (0.00s)

                                                
                                    
TestDownloadOnlyKic (0s)

                                                
                                                
=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:210: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

                                                
                                    
TestAddons/parallel/Olm (0s)

                                                
                                                
=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/Olm
addons_test.go:474: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

                                                
                                    
TestDockerEnvContainerd (0s)

                                                
                                                
=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with docker false darwin amd64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)

                                                
                                    
TestKVMDriverInstallOrUpdate (0s)

                                                
                                                
=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

                                                
                                    
TestFunctional/parallel/PodmanEnv (0s)

                                                
                                                
=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:546: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

                                                
                                    
TestGvisorAddon (0s)

                                                
                                                
=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

                                                
                                    
TestKicCustomNetwork (0s)

                                                
                                                
=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

                                                
                                    
TestKicExistingNetwork (0s)

                                                
                                                
=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

                                                
                                    
TestKicCustomSubnet (0s)

                                                
                                                
=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

                                                
                                    
TestKicStaticIP (0s)

                                                
                                                
=== RUN   TestKicStaticIP
kic_custom_network_test.go:123: only run with docker/podman driver
--- SKIP: TestKicStaticIP (0.00s)

                                                
                                    
TestScheduledStopWindows (0s)

                                                
                                                
=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

                                                
                                    
TestInsufficientStorage (0s)

                                                
                                                
=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

                                                
                                    
TestMissingContainerUpgrade (0s)

                                                
                                                
=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:296: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

                                                
                                    
TestNetworkPlugins/group/cilium (5.51s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium
net_test.go:102: Skipping the test as it's interfering with other tests and is outdated
panic.go:522: 
----------------------- debugLogs start: cilium-758000 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> host: /etc/nsswitch.conf:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /etc/hosts:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /etc/resolv.conf:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> host: crictl pods:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: crictl containers:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> k8s: describe netcat deployment:
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe netcat pod(s):
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: netcat logs:
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns deployment:
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns pods:
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: coredns logs:
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe api server pod(s):
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: api server logs:
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> host: /etc/cni:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: ip a s:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: ip r s:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: iptables-save:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: iptables table nat:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> k8s: describe cilium daemon set:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> k8s: describe cilium daemon set pod(s):
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> k8s: cilium daemon set container(s) logs (current):
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: cilium daemon set container(s) logs (previous):
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe cilium deployment:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> k8s: describe cilium deployment pod(s):
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> k8s: cilium deployment container(s) logs (current):
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: cilium deployment container(s) logs (previous):
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy daemon set:
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy pod(s):
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> k8s: kube-proxy logs:
error: context "cilium-758000" does not exist

                                                
                                                

                                                
                                                
>>> host: kubelet daemon status:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: kubelet daemon config:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> k8s: kubelet logs:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /var/lib/kubelet/config.yaml:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> k8s: kubectl config:
apiVersion: v1
clusters: null
contexts: null
current-context: ""
kind: Config
preferences: {}
users: null

                                                
                                                

                                                
                                                
>>> k8s: cms:
Error in configuration: context was not found for specified context: cilium-758000

                                                
                                                

                                                
                                                
>>> host: docker daemon status:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: docker daemon config:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /etc/docker/daemon.json:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: docker system info:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon status:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon config:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: cri-dockerd version:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: containerd daemon status:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: containerd daemon config:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /lib/systemd/system/containerd.service:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /etc/containerd/config.toml:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: containerd config dump:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: crio daemon status:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: crio daemon config:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: /etc/crio:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                

                                                
                                                
>>> host: crio config:
* Profile "cilium-758000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-758000"

                                                
                                                
----------------------- debugLogs end: cilium-758000 [took: 5.129454506s] --------------------------------
helpers_test.go:175: Cleaning up "cilium-758000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cilium-758000
--- SKIP: TestNetworkPlugins/group/cilium (5.51s)
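Note: the cleanup step above (helpers_test.go:178) simply shells out to the built minikube binary to delete the leftover profile. The Go sketch below is a minimal, hypothetical illustration of that pattern using only the standard library; the cleanupProfile function, the hard-coded binary path, and the error formatting are assumptions for this example, not the actual helpers_test.go code.

// Hypothetical sketch (not the real minikube helper): delete a leftover
// profile by running the minikube binary, as "delete -p cilium-758000" does above.
package main

import (
	"fmt"
	"os/exec"
)

// cleanupProfile runs: <binary> delete -p <profile>
// and returns the combined output on failure.
func cleanupProfile(binary, profile string) error {
	cmd := exec.Command(binary, "delete", "-p", profile)
	out, err := cmd.CombinedOutput()
	if err != nil {
		return fmt.Errorf("delete -p %s failed: %v\n%s", profile, err, out)
	}
	return nil
}

func main() {
	// Paths and profile names are placeholders taken from the log above.
	if err := cleanupProfile("out/minikube-darwin-amd64", "cilium-758000"); err != nil {
		fmt.Println(err)
	}
}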

                                                
                                    
x
+
TestStartStop/group/disable-driver-mounts (0.37s)

                                                
                                                
=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts

                                                
                                                

                                                
                                                
=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:103: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-656000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p disable-driver-mounts-656000
--- SKIP: TestStartStop/group/disable-driver-mounts (0.37s)
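Note: this SKIP is produced by a driver guard in the test itself (start_stop_delete_test.go:103: "only runs on virtualbox"), so on the hyperkit driver the group is skipped after a quick profile cleanup. The sketch below is a minimal, hypothetical example of such a guard using the standard testing package; the driver() helper and the TEST_DRIVER environment variable are assumptions for illustration and do not match the real suite's flag handling.

// Hypothetical sketch (not the real start_stop_delete_test.go): skip a test
// unless it is running on the virtualbox driver.
package integration

import (
	"os"
	"testing"
)

// driver returns the driver under test. The real suite derives this from test
// flags; here we read a made-up environment variable instead.
func driver() string {
	return os.Getenv("TEST_DRIVER")
}

func TestDisableDriverMounts(t *testing.T) {
	if driver() != "virtualbox" {
		t.Skipf("skipping disable-driver-mounts: only runs on virtualbox, got driver %q", driver())
	}
	// ... actual test body would go here ...
}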

                                                
                                    