Test Report: KVM_Linux_containerd 19283

8d2418a61c606cc3028c5bf9242bf095ec458362:2024-07-17:35383
Test fail (15/327)

TestMultiControlPlane/serial/StartCluster (98.73s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-linux-amd64 start -p ha-333994 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=kvm2  --container-runtime=containerd
ha_test.go:101: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p ha-333994 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=kvm2  --container-runtime=containerd: exit status 80 (1m36.793951326s)

-- stdout --
	* [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=19283
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the kvm2 driver based on user configuration
	* Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	* Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	* Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	  - Generating certificates and keys ...
	  - Booting up control plane ...
	  - Configuring RBAC rules ...
	* Configuring CNI (Container Networking Interface) ...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: storage-provisioner, default-storageclass
	
	* Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	* Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	* Found network options:
	  - NO_PROXY=192.168.39.180
	* Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	  - env NO_PROXY=192.168.39.180
	
	

-- /stdout --
** stderr ** 
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
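The repeated "will retry after …" lines above come from minikube's retry helper, which polls for the VM's DHCP lease with growing (jittered) delays until an IP appears. A minimal sketch of that retry-with-backoff loop, where `check_ip`, the success condition, and the doubling delay schedule are illustrative stand-ins rather than minikube's actual code:

```shell
# Sketch of a retry-with-growing-delay loop, as suggested by the
# "will retry after ..." log lines. check_ip is a placeholder for
# "does the domain have an IP yet?"; it succeeds on the third call
# so the loop terminates.
attempt=0
check_ip() {
  attempt=$((attempt + 1))
  [ "$attempt" -ge 3 ]
}
delay=1
while ! check_ip; do
  echo "will retry after ${delay}s: waiting for machine to come up"
  sleep "$delay"
  delay=$((delay * 2))   # grow the delay between attempts
done
echo "found IP after $attempt checks"
```

minikube's real helper (retry.go) also randomizes the interval, which the log's uneven delays (1.14s, 2.32s, 2.90s, …) reflect.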
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
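The hostname-provisioning command above first sets the hostname, then ensures `/etc/hosts` maps `127.0.1.1` to the machine name, rewriting an existing `127.0.1.1` entry if one is present and appending otherwise. The same logic can be exercised safely against a temporary file (the hostname `ha-333994` is taken from the log; `sudo` and the real `/etc/hosts` path are dropped for illustration):

```shell
# Sketch of the /etc/hosts patching logic shown in the log, run against
# a temp file instead of the real /etc/hosts.
hosts=$(mktemp)
printf '127.0.0.1 localhost\n127.0.1.1 minikube\n' > "$hosts"
name=ha-333994
if ! grep -q "\s$name$" "$hosts"; then            # name not mapped yet
  if grep -q '^127\.0\.1\.1\s' "$hosts"; then     # existing 127.0.1.1 entry
    sed -i "s/^127\.0\.1\.1\s.*/127.0.1.1 $name/" "$hosts"
  else
    echo "127.0.1.1 $name" >> "$hosts"
  fi
fi
grep '127.0.1.1' "$hosts"
```

The `\s` escapes rely on GNU grep/sed extensions, which is what the Buildroot guest ships.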
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
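The `fix.go` lines above run `date +%s.%N` on the guest, compare it to the host clock, and accept the result when the delta (here 78.44597ms) is within tolerance. A sketch of that comparison using the two timestamps from the log (the 1-second tolerance is illustrative, not minikube's exact threshold):

```shell
# Sketch of the guest-clock tolerance check, using the timestamps
# reported in the log above. The tolerance value is illustrative.
guest=1721237158.504657493
host=1721237158.426211523
delta=$(awk -v g="$guest" -v h="$host" 'BEGIN { d = g - h; if (d < 0) d = -d; print d }')
awk -v d="$delta" 'BEGIN { exit !(d < 1.0) }' && echo "within tolerance: ${delta}s"
```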
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] M
ountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
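The two log lines above show minikube injecting a `host.minikube.internal` record into CoreDNS by piping the `coredns` ConfigMap through `sed`, inserting a `hosts {}` stanza immediately before the `forward . /etc/resolv.conf` directive. A minimal Python sketch of that transformation (hypothetical helper name; minikube's actual implementation is the `sed` pipeline shown in the log, not this code):

```python
def inject_host_record(corefile: str, ip: str, hostname: str) -> str:
    """Insert a hosts{} stanza before the 'forward . /etc/resolv.conf'
    directive of a Corefile, mirroring the sed pipeline in the log above."""
    hosts_block = (
        "        hosts {\n"
        f"           {ip} {hostname}\n"
        "           fallthrough\n"   # unmatched names fall through to forward
        "        }\n"
    )
    out = []
    for line in corefile.splitlines(keepends=True):
        if line.lstrip().startswith("forward . /etc/resolv.conf"):
            out.append(hosts_block)  # stanza goes just above the forwarder
        out.append(line)
    return "".join(out)
```

The `fallthrough` directive matters: without it, any name not listed in the `hosts` stanza would get NXDOMAIN instead of being forwarded to the host's resolver.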
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
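Stripped of the per-line log prefixes, the libvirt domain definition logged above reassembles to the following XML (reconstructed from the log lines for readability only; paths are the machine-store paths from this run):

```xml
<domain type='kvm'>
  <name>ha-333994-m02</name>
  <memory unit='MiB'>2200</memory>
  <vcpu>2</vcpu>
  <features>
    <acpi/>
    <apic/>
    <pae/>
  </features>
  <cpu mode='host-passthrough'>
  </cpu>
  <os>
    <type>hvm</type>
    <boot dev='cdrom'/>
    <boot dev='hd'/>
    <bootmenu enable='no'/>
  </os>
  <devices>
    <disk type='file' device='cdrom'>
      <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
      <target dev='hdc' bus='scsi'/>
      <readonly/>
    </disk>
    <disk type='file' device='disk'>
      <driver name='qemu' type='raw' cache='default' io='threads' />
      <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
      <target dev='hda' bus='virtio'/>
    </disk>
    <interface type='network'>
      <source network='mk-ha-333994'/>
      <model type='virtio'/>
    </interface>
    <interface type='network'>
      <source network='default'/>
      <model type='virtio'/>
    </interface>
    <serial type='pty'>
      <target port='0'/>
    </serial>
    <console type='pty'>
      <target type='serial' port='0'/>
    </console>
    <rng model='virtio'>
      <backend model='random'>/dev/random</backend>
    </rng>
  </devices>
</domain>
```

Note the boot order (`cdrom` before `hd`): the VM boots the minikube ISO first, and the raw disk image is attached as the persistent `virtio` disk.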
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 

                                                
                                                
** /stderr **
ha_test.go:103: failed to fresh-start ha (multi-control plane) cluster. args "out/minikube-linux-amd64 start -p ha-333994 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=kvm2  --container-runtime=containerd" : exit status 80
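The root cause above is a TCP connection reset while fetching the kubelet binary from dl.k8s.io; as the `?checksum=file:...kubelet.sha256` query shows, the downloader validates the file against the release's published `.sha256` digest, which is how a truncated or corrupted transfer is caught. A minimal local sketch of that validation step (the file names here are hypothetical stand-ins, not the actual minikube cache paths):

```shell
# Stand-in for a downloaded release binary (hypothetical file name).
printf 'fake-kubelet-bytes' > kubelet.download

# Each release binary on dl.k8s.io is published alongside a .sha256 file
# containing the bare hex digest; reproduce that format locally.
sha256sum kubelet.download | awk '{print $1}' > kubelet.sha256

# Verify: recompute the digest and compare it against the published value.
computed=$(sha256sum kubelet.download | awk '{print $1}')
expected=$(cat kubelet.sha256)
if [ "$computed" = "$expected" ]; then
    echo "checksum OK"
else
    echo "checksum mismatch" >&2
    exit 1
fi
```

A re-run of the test typically succeeds once the connection to the release CDN is stable again; the checksum comparison is what turns a partial download into a hard failure rather than a silently corrupt binary.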
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/StartCluster FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/StartCluster]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.213385186s)
helpers_test.go:252: TestMultiControlPlane/serial/StartCluster logs: 
-- stdout --
	
	==> Audit <==
	|----------------|-------------------------------------------------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	|    Command     |                                     Args                                      |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|----------------|-------------------------------------------------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| image          | functional-142583 image ls                                                    | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	| ssh            | functional-142583 ssh findmnt                                                 | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	|                | -T /mount1                                                                    |                   |         |         |                     |                     |
	| image          | functional-142583 image load                                                  | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	|                | /home/jenkins/workspace/KVM_Linux_containerd_integration/echo-server-save.tar |                   |         |         |                     |                     |
	|                | --alsologtostderr                                                             |                   |         |         |                     |                     |
	| ssh            | functional-142583 ssh findmnt                                                 | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	|                | -T /mount2                                                                    |                   |         |         |                     |                     |
	| ssh            | functional-142583 ssh findmnt                                                 | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	|                | -T /mount3                                                                    |                   |         |         |                     |                     |
	| mount          | -p functional-142583                                                          | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC |                     |
	|                | --kill=true                                                                   |                   |         |         |                     |                     |
	| addons         | functional-142583 addons list                                                 | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	| addons         | functional-142583 addons list                                                 | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	|                | -o json                                                                       |                   |         |         |                     |                     |
	| ssh            | functional-142583 ssh echo                                                    | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	|                | hello                                                                         |                   |         |         |                     |                     |
	| image          | functional-142583 image ls                                                    | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	| ssh            | functional-142583 ssh cat                                                     | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	|                | /etc/hostname                                                                 |                   |         |         |                     |                     |
	| image          | functional-142583 image save --daemon                                         | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:24 UTC | 17 Jul 24 17:24 UTC |
	|                | docker.io/kicbase/echo-server:functional-142583                               |                   |         |         |                     |                     |
	|                | --alsologtostderr                                                             |                   |         |         |                     |                     |
	| update-context | functional-142583                                                             | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | update-context                                                                |                   |         |         |                     |                     |
	|                | --alsologtostderr -v=2                                                        |                   |         |         |                     |                     |
	| update-context | functional-142583                                                             | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | update-context                                                                |                   |         |         |                     |                     |
	|                | --alsologtostderr -v=2                                                        |                   |         |         |                     |                     |
	| update-context | functional-142583                                                             | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | update-context                                                                |                   |         |         |                     |                     |
	|                | --alsologtostderr -v=2                                                        |                   |         |         |                     |                     |
	| service        | functional-142583 service                                                     | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | hello-node-connect --url                                                      |                   |         |         |                     |                     |
	| image          | functional-142583                                                             | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | image ls --format short                                                       |                   |         |         |                     |                     |
	|                | --alsologtostderr                                                             |                   |         |         |                     |                     |
	| image          | functional-142583                                                             | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | image ls --format yaml                                                        |                   |         |         |                     |                     |
	|                | --alsologtostderr                                                             |                   |         |         |                     |                     |
	| ssh            | functional-142583 ssh pgrep                                                   | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC |                     |
	|                | buildkitd                                                                     |                   |         |         |                     |                     |
	| image          | functional-142583                                                             | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | image ls --format json                                                        |                   |         |         |                     |                     |
	|                | --alsologtostderr                                                             |                   |         |         |                     |                     |
	| image          | functional-142583 image build -t                                              | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | localhost/my-image:functional-142583                                          |                   |         |         |                     |                     |
	|                | testdata/build --alsologtostderr                                              |                   |         |         |                     |                     |
	| image          | functional-142583                                                             | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	|                | image ls --format table                                                       |                   |         |         |                     |                     |
	|                | --alsologtostderr                                                             |                   |         |         |                     |                     |
	| image          | functional-142583 image ls                                                    | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	| delete         | -p functional-142583                                                          | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	| start          | -p ha-333994 --wait=true                                                      | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC |                     |
	|                | --memory=2200 --ha                                                            |                   |         |         |                     |                     |
	|                | -v=7 --alsologtostderr                                                        |                   |         |         |                     |                     |
	|                | --driver=kvm2                                                                 |                   |         |         |                     |                     |
	|                | --container-runtime=containerd                                                |                   |         |         |                     |                     |
	|----------------|-------------------------------------------------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:container
d CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SS
HAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kubernete
sVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:
0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
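	The guest-clock check logged above runs `date +%s.%N` on the VM over SSH and accepts the result when the host/guest delta is small. A minimal sketch of that comparison (the 2s tolerance below is an assumption, not minikube's actual threshold):

	```shell
	# Compare two epoch timestamps and accept if they differ by less than
	# a tolerance (2 seconds here -- an assumed value for illustration).
	host_ts=$(date +%s.%N)
	guest_ts=$(date +%s.%N)   # in the real flow this comes from `date` run over SSH
	delta=$(awk -v h="$host_ts" -v g="$guest_ts" 'BEGIN { d = h - g; if (d < 0) d = -d; printf "%.6f", d }')
	if awk -v d="$delta" 'BEGIN { exit !(d < 2) }'; then
	  echo "guest clock delta is within tolerance: ${delta}s"
	fi
	```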
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
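	The run of `sed` edits above rewrites /etc/containerd/config.toml to select the cgroupfs driver. The key substitution can be exercised safely against a scratch copy (the file contents below are a minimal stand-in, not the real containerd config):

	```shell
	# Apply minikube's SystemdCgroup substitution to a scratch config copy.
	cfg=$(mktemp)
	cat > "$cfg" <<'EOF'
	[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
	  SystemdCgroup = true
	EOF
	# Same regex minikube runs to switch containerd to the cgroupfs driver.
	sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
	grep 'SystemdCgroup' "$cfg"   # indentation is preserved by the \1 backreference
	rm -f "$cfg"
	```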
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
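	The retry above polls `stat` until the containerd socket appears after the restart. As a bash sketch (the 60s default timeout and 1s interval are assumptions; minikube uses its own backoff):

	```shell
	# Poll for a socket path until it exists or a timeout elapses.
	wait_for_socket() {
	  local path=$1 timeout=${2:-60}
	  local deadline=$((SECONDS + timeout))
	  until stat "$path" >/dev/null 2>&1; do
	    if [ "$SECONDS" -ge "$deadline" ]; then
	      echo "timed out waiting for $path" >&2
	      return 1
	    fi
	    sleep 1
	  done
	}
	# e.g. wait_for_socket /run/containerd/containerd.sock 60
	```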
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] M
ountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
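	The /etc/hosts update above is idempotent: it strips any existing line ending in the hostname before appending the fresh entry, so repeated starts never accumulate duplicates. The same pattern against a scratch file (the `add_host` helper is hypothetical, not minikube code):

	```shell
	# Idempotent hosts-entry update: drop any line ending in the hostname,
	# then append the current ip<TAB>hostname pair.
	hosts=$(mktemp)
	printf '127.0.0.1\tlocalhost\n' > "$hosts"
	add_host() {
	  local ip=$1 name=$2
	  { grep -v "$(printf '\t%s$' "$name")" "$hosts"; printf '%s\t%s\n' "$ip" "$name"; } > "$hosts.new"
	  mv "$hosts.new" "$hosts"
	}
	add_host 192.168.39.254 control-plane.minikube.internal
	add_host 192.168.39.254 control-plane.minikube.internal   # re-run: still one entry
	grep -c 'control-plane.minikube.internal' "$hosts"        # -> 1
	rm -f "$hosts"
	```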
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
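	The apiserver certificate above is signed with the service IP, localhost, the node IP, and the HA VIP as SANs. An openssl sketch producing and inspecting an equivalent SAN set (scratch paths; minikube generates these certs with its own Go crypto helpers, not openssl):

	```shell
	# Generate a throwaway cert carrying the same SAN IPs as the log above,
	# then print its subjectAltName extension (requires OpenSSL >= 1.1.1).
	dir=$(mktemp -d)
	openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
	  -keyout "$dir/apiserver.key" -out "$dir/apiserver.crt" \
	  -subj "/CN=minikube" \
	  -addext "subjectAltName=IP:10.96.0.1,IP:127.0.0.1,IP:192.168.39.180,IP:192.168.39.254" \
	  2>/dev/null
	openssl x509 -in "$dir/apiserver.crt" -noout -ext subjectAltName
	rm -rf "$dir"
	```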
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	86b483ab22e1a       6e38f40d628db       27 seconds ago       Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       27 seconds ago       Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       27 seconds ago       Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       39 seconds ago       Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       44 seconds ago       Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       59 seconds ago       Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       About a minute ago   Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       About a minute ago   Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       About a minute ago   Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       About a minute ago   Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.069323091Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.069416728Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.092092406Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.092222348Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.092248869Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.092335207Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.111018825Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.111107906Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.111124103Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.111525114Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.203194655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-sh96r,Uid:40fe2cb3-25ad-4d21-a67c-16752d657439,Namespace:kube-system,Attempt:0,} returns sandbox id \"a55470f3593c58d278ff17cf8fd31c0bbba9c51036939baae2b698a9a530e069\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.209180903Z" level=info msg="CreateContainer within sandbox \"a55470f3593c58d278ff17cf8fd31c0bbba9c51036939baae2b698a9a530e069\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.224900705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-n4xtd,Uid:29a654a4-f52d-4594-b402-93061221e0e1,Namespace:kube-system,Attempt:0,} returns sandbox id \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.227613767Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.255811544Z" level=info msg="CreateContainer within sandbox \"a55470f3593c58d278ff17cf8fd31c0bbba9c51036939baae2b698a9a530e069\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.256711991Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.269282488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:storage-provisioner,Uid:123c311b-67ed-42b2-ad53-cc59077dfbe7,Namespace:kube-system,Attempt:0,} returns sandbox id \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:27:07 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:26:46 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:26:46 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:26:46 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:26:46 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (10 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     45s
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     45s
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         59s
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      46s
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         59s
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         59s
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         46s
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         59s
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         59s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         45s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 44s                kube-proxy       
	  Normal  Starting                 66s                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  66s (x4 over 66s)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    66s (x4 over 66s)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     66s (x3 over 66s)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  66s                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 59s                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  59s                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    59s                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     59s                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  59s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           46s                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                29s                kubelet          Node ha-333994 status is now: NodeReady
	
	
	==> dmesg <==
	[Jul17 17:25] You have booted with nomodeset. This means your GPU drivers are DISABLED
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.567184Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 switched to configuration voters=(808613133158692504)"}
	{"level":"info","ts":"2024-07-17T17:26:10.569058Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","added-peer-id":"b38c55c42a3b698","added-peer-peer-urls":["https://192.168.39.180:2380"]}
	{"level":"info","ts":"2024-07-17T17:26:10.569991Z","caller":"embed/etcd.go:726","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2024-07-17T17:26:10.574483Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"b38c55c42a3b698","initial-advertise-peer-urls":["https://192.168.39.180:2380"],"listen-peer-urls":["https://192.168.39.180:2380"],"advertise-client-urls":["https://192.168.39.180:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.39.180:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2024-07-17T17:26:10.574541Z","caller":"embed/etcd.go:857","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2024-07-17T17:26:10.574981Z","caller":"embed/etcd.go:597","msg":"serving peer traffic","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:26:10.5751Z","caller":"embed/etcd.go:569","msg":"cmux::serve","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:26:10.795898Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796088Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	
	
	==> kernel <==
	 17:27:15 up 1 min,  0 users,  load average: 0.68, 0.29, 0.10
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:26:35.792111       1 main.go:110] connected to apiserver: https://10.96.0.1:443
	I0717 17:26:35.883368       1 main.go:140] hostIP = 192.168.39.180
	podIP = 192.168.39.180
	I0717 17:26:35.883668       1 main.go:149] setting mtu 1500 for CNI 
	I0717 17:26:35.883736       1 main.go:179] kindnetd IP family: "ipv4"
	I0717 17:26:35.883770       1 main.go:183] noMask IPv4 subnets: [10.244.0.0/16]
	I0717 17:26:36.593010       1 main.go:223] Error initializing nftables: could not run nftables command: /dev/stdin:1:1-37: Error: Could not process rule: Operation not supported
	add table inet kube-network-policies
	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
	, skipping network policies
	I0717 17:26:46.602201       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:26:46.602460       1 main.go:303] handling current node
	I0717 17:26:56.596540       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:26:56.596752       1 main.go:303] handling current node
	I0717 17:27:06.600804       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:27:06.600898       1 main.go:303] handling current node
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	I0717 17:26:12.626156       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0717 17:26:12.627422       1 apf_controller.go:379] Running API Priority and Fairness config worker
	I0717 17:26:12.627461       1 apf_controller.go:382] Running API Priority and Fairness periodic rebalancing process
	I0717 17:26:12.633544       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0717 17:26:12.633578       1 policy_source.go:224] refreshing policies
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:29.073403       1 shared_informer.go:320] Caches are synced for ephemeral
	I0717 17:26:29.073778       1 shared_informer.go:320] Caches are synced for PVC protection
	I0717 17:26:29.126092       1 shared_informer.go:320] Caches are synced for attach detach
	I0717 17:26:29.127955       1 shared_informer.go:320] Caches are synced for persistent volume
	I0717 17:26:29.172459       1 shared_informer.go:320] Caches are synced for cronjob
	I0717 17:26:29.227981       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:26:29.229561       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:26:29.645377       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:26:29.645518       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0717 17:26:29.676538       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:26:30.131742       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="401.168376ms"
	I0717 17:26:30.147417       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="15.609225ms"
	I0717 17:26:30.150595       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="72.178µs"
	I0717 17:26:30.156045       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="88.456µs"
	I0717 17:26:46.686080       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="1.287244ms"
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.605015    1321 topology_manager.go:215] "Topology Admit Handler" podUID="de0fd552-4dd9-4de0-9520-1427e282021b" podNamespace="kube-system" podName="kube-proxy-jlzt5"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.617045    1321 topology_manager.go:215] "Topology Admit Handler" podUID="9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8" podNamespace="kube-system" podName="kindnet-5zksq"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.680457    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/de0fd552-4dd9-4de0-9520-1427e282021b-xtables-lock\") pod \"kube-proxy-jlzt5\" (UID: \"de0fd552-4dd9-4de0-9520-1427e282021b\") " pod="kube-system/kube-proxy-jlzt5"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.680611    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8-xtables-lock\") pod \"kindnet-5zksq\" (UID: \"9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8\") " pod="kube-system/kindnet-5zksq"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.680692    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfqb\" (UniqueName: \"kubernetes.io/projected/9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8-kube-api-access-whfqb\") pod \"kindnet-5zksq\" (UID: \"9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8\") " pod="kube-system/kindnet-5zksq"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.680897    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/de0fd552-4dd9-4de0-9520-1427e282021b-kube-proxy\") pod \"kube-proxy-jlzt5\" (UID: \"de0fd552-4dd9-4de0-9520-1427e282021b\") " pod="kube-system/kube-proxy-jlzt5"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.681026    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de0fd552-4dd9-4de0-9520-1427e282021b-lib-modules\") pod \"kube-proxy-jlzt5\" (UID: \"de0fd552-4dd9-4de0-9520-1427e282021b\") " pod="kube-system/kube-proxy-jlzt5"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.681158    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtscf\" (UniqueName: \"kubernetes.io/projected/de0fd552-4dd9-4de0-9520-1427e282021b-kube-api-access-xtscf\") pod \"kube-proxy-jlzt5\" (UID: \"de0fd552-4dd9-4de0-9520-1427e282021b\") " pod="kube-system/kube-proxy-jlzt5"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.681280    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-cfg\" (UniqueName: \"kubernetes.io/host-path/9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8-cni-cfg\") pod \"kindnet-5zksq\" (UID: \"9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8\") " pod="kube-system/kindnet-5zksq"
	Jul 17 17:26:29 ha-333994 kubelet[1321]: I0717 17:26:29.681398    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8-lib-modules\") pod \"kindnet-5zksq\" (UID: \"9b72ef3c-dcf4-4ec3-8087-00689ff2d2e8\") " pod="kube-system/kindnet-5zksq"
	Jul 17 17:26:36 ha-333994 kubelet[1321]: I0717 17:26:36.547674    1321 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jlzt5" podStartSLOduration=7.547648694 podStartE2EDuration="7.547648694s" podCreationTimestamp="2024-07-17 17:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-17 17:26:30.526621262 +0000 UTC m=+14.258990056" watchObservedRunningTime="2024-07-17 17:26:36.547648694 +0000 UTC m=+20.280017488"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.644940    1321 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.681890    1321 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kindnet-5zksq" podStartSLOduration=12.762634758 podStartE2EDuration="17.68181331s" podCreationTimestamp="2024-07-17 17:26:29 +0000 UTC" firstStartedPulling="2024-07-17 17:26:30.545986834 +0000 UTC m=+14.278355610" lastFinishedPulling="2024-07-17 17:26:35.465165387 +0000 UTC m=+19.197534162" observedRunningTime="2024-07-17 17:26:36.549949571 +0000 UTC m=+20.282318365" watchObservedRunningTime="2024-07-17 17:26:46.68181331 +0000 UTC m=+30.414182103"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.682086    1321 topology_manager.go:215] "Topology Admit Handler" podUID="40fe2cb3-25ad-4d21-a67c-16752d657439" podNamespace="kube-system" podName="coredns-7db6d8ff4d-sh96r"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.688079    1321 topology_manager.go:215] "Topology Admit Handler" podUID="123c311b-67ed-42b2-ad53-cc59077dfbe7" podNamespace="kube-system" podName="storage-provisioner"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.691343    1321 topology_manager.go:215] "Topology Admit Handler" podUID="29a654a4-f52d-4594-b402-93061221e0e1" podNamespace="kube-system" podName="coredns-7db6d8ff4d-n4xtd"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.800136    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40fe2cb3-25ad-4d21-a67c-16752d657439-config-volume\") pod \"coredns-7db6d8ff4d-sh96r\" (UID: \"40fe2cb3-25ad-4d21-a67c-16752d657439\") " pod="kube-system/coredns-7db6d8ff4d-sh96r"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.800186    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a654a4-f52d-4594-b402-93061221e0e1-config-volume\") pod \"coredns-7db6d8ff4d-n4xtd\" (UID: \"29a654a4-f52d-4594-b402-93061221e0e1\") " pod="kube-system/coredns-7db6d8ff4d-n4xtd"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.800207    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/host-path/123c311b-67ed-42b2-ad53-cc59077dfbe7-tmp\") pod \"storage-provisioner\" (UID: \"123c311b-67ed-42b2-ad53-cc59077dfbe7\") " pod="kube-system/storage-provisioner"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.800224    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp2fc\" (UniqueName: \"kubernetes.io/projected/40fe2cb3-25ad-4d21-a67c-16752d657439-kube-api-access-kp2fc\") pod \"coredns-7db6d8ff4d-sh96r\" (UID: \"40fe2cb3-25ad-4d21-a67c-16752d657439\") " pod="kube-system/coredns-7db6d8ff4d-sh96r"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.800250    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88sv\" (UniqueName: \"kubernetes.io/projected/29a654a4-f52d-4594-b402-93061221e0e1-kube-api-access-d88sv\") pod \"coredns-7db6d8ff4d-n4xtd\" (UID: \"29a654a4-f52d-4594-b402-93061221e0e1\") " pod="kube-system/coredns-7db6d8ff4d-n4xtd"
	Jul 17 17:26:46 ha-333994 kubelet[1321]: I0717 17:26:46.800268    1321 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq9pr\" (UniqueName: \"kubernetes.io/projected/123c311b-67ed-42b2-ad53-cc59077dfbe7-kube-api-access-wq9pr\") pod \"storage-provisioner\" (UID: \"123c311b-67ed-42b2-ad53-cc59077dfbe7\") " pod="kube-system/storage-provisioner"
	Jul 17 17:26:47 ha-333994 kubelet[1321]: I0717 17:26:47.624955    1321 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-n4xtd" podStartSLOduration=17.6249316 podStartE2EDuration="17.6249316s" podCreationTimestamp="2024-07-17 17:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-17 17:26:47.590306238 +0000 UTC m=+31.322675033" watchObservedRunningTime="2024-07-17 17:26:47.6249316 +0000 UTC m=+31.357300406"
	Jul 17 17:26:47 ha-333994 kubelet[1321]: I0717 17:26:47.647670    1321 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/storage-provisioner" podStartSLOduration=17.647650055 podStartE2EDuration="17.647650055s" podCreationTimestamp="2024-07-17 17:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-17 17:26:47.625496174 +0000 UTC m=+31.357864970" watchObservedRunningTime="2024-07-17 17:26:47.647650055 +0000 UTC m=+31.380018850"
	Jul 17 17:26:48 ha-333994 kubelet[1321]: I0717 17:26:48.594167    1321 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-sh96r" podStartSLOduration=18.594150349 podStartE2EDuration="18.594150349s" podCreationTimestamp="2024-07-17 17:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-07-17 17:26:47.650892639 +0000 UTC m=+31.383261416" watchObservedRunningTime="2024-07-17 17:26:48.594150349 +0000 UTC m=+32.326519140"
	
	
	==> storage-provisioner [86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21] <==
	I0717 17:26:47.481175       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0717 17:26:47.495592       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0717 17:26:47.495817       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0717 17:26:47.507492       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0717 17:26:47.511210       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8!
	I0717 17:26:47.516960       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"9a33d6ef-207d-4ea5-bcad-ac569127b889", APIVersion:"v1", ResourceVersion:"447", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8 became leader
	I0717 17:26:47.611924       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8!
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMultiControlPlane/serial/StartCluster FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/StartCluster (98.73s)

TestMultiControlPlane/serial/DeployApp (683.64s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- rollout status deployment/busybox
E0717 17:27:52.134287   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:28:19.823562   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:29:41.797125   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:41.802382   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:41.812601   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:41.832891   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:41.873163   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:41.953468   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:42.113897   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:42.434600   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:43.075341   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:44.355829   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:46.916319   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:29:52.037347   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:30:02.277495   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:30:22.758368   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:31:03.719833   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:32:25.642905   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:32:52.134016   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:34:41.797446   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:35:09.486245   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
ha_test.go:133: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- rollout status deployment/busybox: exit status 1 (10m3.868005584s)

-- stdout --
	Waiting for deployment "busybox" rollout to finish: 0 of 3 updated replicas are available...
	Waiting for deployment "busybox" rollout to finish: 1 of 3 updated replicas are available...

-- /stdout --
** stderr ** 
	error: deployment "busybox" exceeded its progress deadline

** /stderr **
ha_test.go:135: failed to deploy busybox to ha (multi-control plane) cluster
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
E0717 17:37:52.133791   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:149: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:159: failed to resolve pod IPs: expected 3 Pod IPs but got 1 (may be temporary), output: "\n-- stdout --\n\t'10.244.0.4'\n\n-- /stdout --"
ha_test.go:163: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-5ngfp -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-74lsp -- nslookup kubernetes.io
ha_test.go:171: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-74lsp -- nslookup kubernetes.io: exit status 1 (107.323521ms)

** stderr ** 
	Error from server (BadRequest): pod busybox-fc5497c4f-74lsp does not have a host assigned
** /stderr **
ha_test.go:173: Pod busybox-fc5497c4f-74lsp could not resolve 'kubernetes.io': exit status 1
ha_test.go:171: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-djvz6 -- nslookup kubernetes.io
ha_test.go:171: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-djvz6 -- nslookup kubernetes.io: exit status 1 (114.37227ms)

** stderr ** 
	Error from server (BadRequest): pod busybox-fc5497c4f-djvz6 does not have a host assigned
** /stderr **
ha_test.go:173: Pod busybox-fc5497c4f-djvz6 could not resolve 'kubernetes.io': exit status 1
ha_test.go:181: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-5ngfp -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-74lsp -- nslookup kubernetes.default
ha_test.go:181: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-74lsp -- nslookup kubernetes.default: exit status 1 (106.655786ms)

** stderr ** 
	Error from server (BadRequest): pod busybox-fc5497c4f-74lsp does not have a host assigned
** /stderr **
ha_test.go:183: Pod busybox-fc5497c4f-74lsp could not resolve 'kubernetes.default': exit status 1
ha_test.go:181: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-djvz6 -- nslookup kubernetes.default
ha_test.go:181: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-djvz6 -- nslookup kubernetes.default: exit status 1 (108.007026ms)

** stderr ** 
	Error from server (BadRequest): pod busybox-fc5497c4f-djvz6 does not have a host assigned
** /stderr **
ha_test.go:183: Pod busybox-fc5497c4f-djvz6 could not resolve 'kubernetes.default': exit status 1
ha_test.go:189: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-5ngfp -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-74lsp -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-74lsp -- nslookup kubernetes.default.svc.cluster.local: exit status 1 (105.878172ms)

** stderr ** 
	Error from server (BadRequest): pod busybox-fc5497c4f-74lsp does not have a host assigned
** /stderr **
ha_test.go:191: Pod busybox-fc5497c4f-74lsp could not resolve local service (kubernetes.default.svc.cluster.local): exit status 1
ha_test.go:189: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-djvz6 -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-djvz6 -- nslookup kubernetes.default.svc.cluster.local: exit status 1 (108.067757ms)

** stderr ** 
	Error from server (BadRequest): pod busybox-fc5497c4f-djvz6 does not have a host assigned
** /stderr **
ha_test.go:191: Pod busybox-fc5497c4f-djvz6 could not resolve local service (kubernetes.default.svc.cluster.local): exit status 1
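The BadRequest errors above come from `kubectl exec` against pods that were never scheduled, i.e. pods whose `spec.nodeName` is still empty, which matches two busybox replicas waiting for the additional control-plane nodes that the failed start never brought up. A hedged sketch of the check (the real kubectl query needs a live cluster, so its result is stubbed here):

```shell
# Real query: kubectl get pod busybox-fc5497c4f-74lsp -o jsonpath='{.spec.nodeName}'
# Stubbed: empty for the two failing busybox pods in this log.
node_name=""
if [ -z "$node_name" ]; then
  echo "pod is unscheduled; kubectl exec fails with BadRequest"
fi
```

On a live cluster the same pods would also show as Pending in `kubectl get pods --field-selector=status.phase=Pending -o wide`.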
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/DeployApp FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/DeployApp]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.222497952s)
helpers_test.go:252: TestMultiControlPlane/serial/DeployApp logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |      Profile      |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	| image   | functional-142583 image ls           | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	| delete  | -p functional-142583                 | functional-142583 | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC | 17 Jul 24 17:25 UTC |
	| start   | -p ha-333994 --wait=true             | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:25 UTC |                     |
	|         | --memory=2200 --ha                   |                   |         |         |                     |                     |
	|         | -v=7 --alsologtostderr               |                   |         |         |                     |                     |
	|         | --driver=kvm2                        |                   |         |         |                     |                     |
	|         | --container-runtime=containerd       |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- apply -f             | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:27 UTC | 17 Jul 24 17:27 UTC |
	|         | ./testdata/ha/ha-pod-dns-test.yaml   |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- rollout status       | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:27 UTC |                     |
	|         | deployment/busybox                   |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.io               |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |                   |         |         |                     |                     |
	|         | nslookup kubernetes.default          |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994         | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |                   |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |                   |         |         |                     |                     |
	|---------|--------------------------------------|-------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
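The run of `sed` commands above rewrites `/etc/containerd/config.toml` in place before containerd is restarted. A minimal sketch of the cgroup-driver rewrite, run against a throwaway sample file rather than a live config (file contents are illustrative):

```shell
#!/bin/sh
# Sketch of the SystemdCgroup rewrite performed above, applied to a sample
# config instead of a real /etc/containerd/config.toml. Assumes GNU sed.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
  SystemdCgroup = true
EOF
# The same substitution minikube runs to force the cgroupfs driver,
# preserving whatever indentation the line already had
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
grep 'SystemdCgroup = false' "$cfg"
rm -f "$cfg"
```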
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] M
ountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
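The existence check above gates the transfer: the preload tarball is copied only when `stat` on the remote path fails. A sketch of that pattern with throwaway temp files standing in for the real paths and for the scp transfer:

```shell
#!/bin/sh
# Sketch of the existence-check-before-transfer pattern above. Paths are
# throwaway temp files, not the real /preloaded.tar.lz4.
src=$(mktemp); printf 'payload' > "$src"
dst=$(mktemp -u)             # a path that does not exist yet
if ! stat -c "%s %y" "$dst" >/dev/null 2>&1; then
  cp "$src" "$dst"           # stands in for the scp in the log
fi
stat -c '%s' "$dst"          # prints 7, the size of "payload"
rm -f "$src" "$dst"
```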
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
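The generated kubeadm config above embeds the pod and service CIDRs. A quick way to sanity-check them from the YAML, sketched against a heredoc reproducing just the networking stanza shown (a real check would read the actual `/var/tmp/minikube/kubeadm.yaml`):

```shell
#!/bin/sh
# Sketch: extract the CIDRs from a kubeadm config shaped like the one
# generated above. The heredoc is a stand-in for the real file.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
networking:
  dnsDomain: cluster.local
  podSubnet: "10.244.0.0/16"
  serviceSubnet: 10.96.0.0/12
EOF
# Split on ": " and strip the quotes kubeadm allows around CIDR values
pod=$(awk -F': ' '/podSubnet/ { gsub(/"/, "", $2); print $2 }' "$cfg")
svc=$(awk -F': ' '/serviceSubnet/ { print $2 }' "$cfg")
echo "podSubnet=$pod serviceSubnet=$svc"
rm -f "$cfg"
```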
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
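The kube-vip manifest above carries the advertised control-plane VIP as a plain `address` env entry. A sketch of pulling it back out of a manifest shaped like this one (the heredoc reproduces only the relevant env entry, with the value from this log):

```shell
#!/bin/sh
# Sketch: read the VIP out of a kube-vip pod manifest like the one above.
# The heredoc stands in for /etc/kubernetes/manifests/kube-vip.yaml.
m=$(mktemp)
cat > "$m" <<'EOF'
    - name: address
      value: 192.168.39.254
EOF
# Match the "- name: address" entry, then read its value line
vip=$(awk '/- name: address/ { getline; sub(/.*value: /, ""); print }' "$m")
echo "kube-vip VIP: $vip"
rm -f "$m"
```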
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
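	The hash-then-symlink pairs above (`openssl x509 -hash -noout` followed by `ln -fs ... /etc/ssl/certs/<hash>.0`) implement OpenSSL's certificate-directory lookup convention, where CA certs are found via symlinks named after the subject hash. A sketch of the same two steps against a throwaway cert in a scratch directory (file names are illustrative; assumes the `openssl` CLI is on PATH):

```shell
DIR=$(mktemp -d)
# Throwaway self-signed cert, standing in for e.g. minikubeCA.pem.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$DIR/demo.key" -out "$DIR/demo.pem" -days 1 2>/dev/null

# Same two steps as the log: compute the subject hash, then link "<hash>.0".
H=$(openssl x509 -hash -noout -in "$DIR/demo.pem")
ln -fs "$DIR/demo.pem" "$DIR/$H.0"
ls -l "$DIR/$H.0"
```

	The `.0` suffix disambiguates distinct certificates whose subjects hash to the same value, which is why the log tests `-L` before creating each link.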
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}

	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
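	The `--discovery-token-ca-cert-hash` value in the join commands above is the SHA-256 digest of the cluster CA's DER-encoded public key (its Subject Public Key Info), which is how joining nodes pin the control plane's identity. A hedged sketch that reproduces kubeadm's documented recipe against a throwaway self-signed CA (file names are illustrative stand-ins for `/var/lib/minikube/certs/ca.crt`; assumes the `openssl` CLI is on PATH):

```shell
DIR=$(mktemp -d)
# Throwaway RSA CA certificate, playing the role of the cluster CA.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=minikubeCA" \
  -keyout "$DIR/ca.key" -out "$DIR/ca.crt" -days 1 2>/dev/null

# kubeadm's documented recipe: extract the public key, DER-encode it, SHA-256 it.
HASH=$(openssl x509 -pubkey -in "$DIR/ca.crt" \
  | openssl rsa -pubin -outform der 2>/dev/null \
  | openssl dgst -sha256 -hex | sed 's/^.* //')
echo "sha256:$HASH"
```

	Because the hash covers only the public key, it stays valid if the CA certificate is re-signed with the same key pair, which is why the same `sha256:a60e42bd...` value appears in both the control-plane and worker join commands.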
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       11 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       11 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       11 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       11 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       12 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       12 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       12 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       12 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       12 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       12 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       12 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:38:30 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     12m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     12m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         12m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      12m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 12m                kube-proxy       
	  Normal  Starting                 12m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  12m (x4 over 12m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    12m (x4 over 12m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     12m (x3 over 12m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  12m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 12m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  12m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    12m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     12m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  12m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           12m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                11m                kubelet          Node ha-333994 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.574483Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"b38c55c42a3b698","initial-advertise-peer-urls":["https://192.168.39.180:2380"],"listen-peer-urls":["https://192.168.39.180:2380"],"advertise-client-urls":["https://192.168.39.180:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.39.180:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2024-07-17T17:26:10.574541Z","caller":"embed/etcd.go:857","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2024-07-17T17:26:10.574981Z","caller":"embed/etcd.go:597","msg":"serving peer traffic","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:26:10.5751Z","caller":"embed/etcd.go:569","msg":"cmux::serve","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:26:10.795898Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796088Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	
	
	==> kernel <==
	 17:38:38 up 12 min,  0 users,  load average: 0.29, 0.24, 0.14
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:36:36.593394       1 main.go:303] handling current node
	I0717 17:36:46.594438       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:36:46.594506       1 main.go:303] handling current node
	I0717 17:36:56.594272       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:36:56.594400       1 main.go:303] handling current node
	I0717 17:37:06.602697       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:06.602756       1 main.go:303] handling current node
	I0717 17:37:16.598595       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:16.598682       1 main.go:303] handling current node
	I0717 17:37:26.601183       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:26.601227       1 main.go:303] handling current node
	I0717 17:37:36.593264       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:36.593319       1 main.go:303] handling current node
	I0717 17:37:46.598363       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:46.598499       1 main.go:303] handling current node
	I0717 17:37:56.595211       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:56.595262       1 main.go:303] handling current node
	I0717 17:38:06.596913       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:38:06.597136       1 main.go:303] handling current node
	I0717 17:38:16.601764       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:38:16.601814       1 main.go:303] handling current node
	I0717 17:38:26.596776       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:38:26.596989       1 main.go:303] handling current node
	I0717 17:38:36.594167       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:38:36.594264       1 main.go:303] handling current node
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	I0717 17:26:12.633544       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0717 17:26:12.633578       1 policy_source.go:224] refreshing policies
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:29.229561       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:26:29.645377       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:26:29.645518       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0717 17:26:29.676538       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:26:30.131742       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="401.168376ms"
	I0717 17:26:30.147417       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="15.609225ms"
	I0717 17:26:30.150595       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="72.178µs"
	I0717 17:26:30.156045       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="88.456µs"
	I0717 17:26:46.686080       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="1.287244ms"
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:34:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:34:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:34:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:35:16 ha-333994 kubelet[1321]: E0717 17:35:16.468626    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:35:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:35:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:35:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:35:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:36:16 ha-333994 kubelet[1321]: E0717 17:36:16.469294    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:37:16 ha-333994 kubelet[1321]: E0717 17:37:16.469310    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:16 ha-333994 kubelet[1321]: E0717 17:38:16.469271    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:36 ha-333994 kubelet[1321]: E0717 17:38:36.696894    1321 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.122.156:43908->192.168.122.156:10010: write tcp 192.168.122.156:43908->192.168.122.156:10010: write: broken pipe
	Jul 17 17:38:37 ha-333994 kubelet[1321]: E0717 17:38:37.471144    1321 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.122.156:43918->192.168.122.156:10010: read tcp 192.168.122.156:43918->192.168.122.156:10010: read: connection reset by peer
	
	
	==> storage-provisioner [86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21] <==
	I0717 17:26:47.481175       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0717 17:26:47.495592       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0717 17:26:47.495817       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0717 17:26:47.507492       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0717 17:26:47.511210       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8!
	I0717 17:26:47.516960       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"9a33d6ef-207d-4ea5-bcad-ac569127b889", APIVersion:"v1", ResourceVersion:"447", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8 became leader
	I0717 17:26:47.611924       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8!
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-74lsp busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/DeployApp]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-74lsp busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-74lsp busybox-fc5497c4f-djvz6:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-74lsp
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-cz6xp (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-cz6xp:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                From               Message
	  ----     ------            ----               ----               -------
	  Warning  FailedScheduling  83s (x3 over 11m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	
	
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                From               Message
	  ----     ------            ----               ----               -------
	  Warning  FailedScheduling  83s (x3 over 11m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/DeployApp FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/DeployApp (683.64s)

                                                
                                    
TestMultiControlPlane/serial/PingHostFromPods (2.56s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-5ngfp -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-5ngfp -- sh -c "ping -c 1 192.168.39.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-74lsp -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:207: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-74lsp -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3": exit status 1 (106.910137ms)

                                                
                                                
** stderr ** 
	Error from server (BadRequest): pod busybox-fc5497c4f-74lsp does not have a host assigned

                                                
                                                
** /stderr **
ha_test.go:209: Pod busybox-fc5497c4f-74lsp could not resolve 'host.minikube.internal': exit status 1
ha_test.go:207: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-djvz6 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:207: (dbg) Non-zero exit: out/minikube-linux-amd64 kubectl -p ha-333994 -- exec busybox-fc5497c4f-djvz6 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3": exit status 1 (114.525841ms)

                                                
                                                
** stderr ** 
	Error from server (BadRequest): pod busybox-fc5497c4f-djvz6 does not have a host assigned

                                                
                                                
** /stderr **
ha_test.go:209: Pod busybox-fc5497c4f-djvz6 could not resolve 'host.minikube.internal': exit status 1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/PingHostFromPods FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/PingHostFromPods]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.1971566s)
helpers_test.go:252: TestMultiControlPlane/serial/PingHostFromPods logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
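Provisioner detection above works by running `cat /etc/os-release` over SSH and matching the `KEY=value` output (here `ID=buildroot`). A simplified sketch of that parse — not minikube's actual parser — stripping the optional quoting seen on `PRETTY_NAME`:

```go
package main

import (
	"fmt"
	"strings"
)

// parseOSRelease turns /etc/os-release style KEY=value lines into a map,
// dropping blanks, comments, and surrounding double quotes.
func parseOSRelease(s string) map[string]string {
	out := map[string]string{}
	for _, line := range strings.Split(s, "\n") {
		line = strings.TrimSpace(line)
		if line == "" || strings.HasPrefix(line, "#") {
			continue
		}
		k, v, ok := strings.Cut(line, "=")
		if !ok {
			continue
		}
		out[k] = strings.Trim(v, `"`)
	}
	return out
}

func main() {
	// Sample mirrors the SSH output captured in the log.
	sample := `NAME=Buildroot
VERSION=2023.02.9-dirty
ID=buildroot
VERSION_ID=2023.02.9
PRETTY_NAME="Buildroot 2023.02.9"`
	info := parseOSRelease(sample)
	fmt.Println("found compatible host:", info["ID"])
}
```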
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
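`provision.go` above generates a server certificate whose SAN list mixes IPs and DNS names (`[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]`). A hedged sketch of that step with `crypto/x509` — self-signed here for brevity, whereas the log shows minikube signing against its own `ca.pem`/`ca-key.pem`:

```go
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"net"
	"time"
)

// GenServerCert self-signs a server cert carrying the given SANs,
// splitting them into IPAddresses and DNSNames as x509 requires.
func GenServerCert(sans []string) (*x509.Certificate, error) {
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		return nil, err
	}
	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(1),
		Subject:      pkix.Name{Organization: []string{"jenkins.ha-333994"}},
		NotBefore:    time.Now(),
		NotAfter:     time.Now().Add(365 * 24 * time.Hour),
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
	}
	for _, s := range sans {
		if ip := net.ParseIP(s); ip != nil {
			tmpl.IPAddresses = append(tmpl.IPAddresses, ip)
		} else {
			tmpl.DNSNames = append(tmpl.DNSNames, s)
		}
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		return nil, err
	}
	return x509.ParseCertificate(der)
}

func main() {
	cert, err := GenServerCert([]string{"127.0.0.1", "192.168.39.180", "ha-333994", "localhost", "minikube"})
	if err != nil {
		panic(err)
	}
	fmt.Println("DNS SANs:", cert.DNSNames)
	fmt.Println("IP SANs:", cert.IPAddresses)
}
```

The resulting `server.pem`/`server-key.pem` pair is what `copyRemoteCerts` then pushes to `/etc/docker` on the VM.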
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] M
ountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
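The three `openssl x509 -hash` / `ln -fs` sequences above all follow one pattern: install a certificate under `/usr/share/ca-certificates`, compute its OpenSSL subject-name hash, and symlink `<hash>.0` in `/etc/ssl/certs` so TLS clients can find it. A minimal self-contained sketch of that pattern (a throwaway cert and temp dir stand in for minikube's real paths, and `sudo` is dropped):

```shell
# Sketch of the per-certificate install pattern from the log above.
# A generated cert and a temp dir stand in for the real system paths.
certdir=$(mktemp -d)
cert="$certdir/minikubeCA.pem"
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$certdir/ca.key" \
  -out "$cert" -subj "/CN=minikubeCA" -days 1 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$cert")   # subject-name hash, e.g. b5213941
ln -fs "$cert" "$certdir/${hash}.0"             # OpenSSL looks up CAs by <hash>.0
```

The `<hash>.0` symlink is what makes `test -L /etc/ssl/certs/b5213941.0` in the log meaningful: OpenSSL resolves trust anchors by that hashed filename, not by the human-readable one.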
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
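The `stat` failure above is how minikube decides this is a first start: a missing `apiserver-kubelet-client` cert makes `stat` exit non-zero, so there is no existing cluster state to preserve. The probe, sketched (path taken from the log; on a machine without minikube state it reports a first start):

```shell
# First-start probe mirroring the log: stat exits non-zero when the
# apiserver-kubelet-client cert is absent, so kubeadm must bootstrap.
cert=/var/lib/minikube/certs/apiserver-kubelet-client.crt
if stat "$cert" >/dev/null 2>&1; then
  state="existing cluster"
else
  state="likely first start"
fi
echo "$state"
```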
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
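Each of the four checks above applies the same stale-config rule: a kubeconfig under `/etc/kubernetes` that does not mention the expected control-plane endpoint gets deleted so kubeadm can regenerate it. The pattern, sketched against a temp dir instead of `/etc/kubernetes` (and without `sudo`):

```shell
# Stale-config cleanup pattern from the log: any kubeconfig that does not
# reference the expected endpoint is removed so kubeadm rewrites it.
endpoint="https://control-plane.minikube.internal:8443"
confdir=$(mktemp -d)                          # stand-in for /etc/kubernetes
printf 'server: %s\n' "$endpoint" > "$confdir/admin.conf"
printf 'server: https://other:8443\n' > "$confdir/kubelet.conf"
for f in admin.conf kubelet.conf controller-manager.conf scheduler.conf; do
  grep -q "$endpoint" "$confdir/$f" 2>/dev/null || rm -f "$confdir/$f"
done
```

In this run every `grep` exits 2 because the files do not exist at all, so minikube logs "config check failed, skipping stale config cleanup" and the `rm -f` calls are no-ops.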
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
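The `--discovery-token-ca-cert-hash` printed above is the SHA-256 digest of the cluster CA certificate's DER-encoded public key. The kubeadm documentation's recipe for recomputing it from a CA cert is sketched below; a throwaway CA keeps it self-contained, whereas on a real minikube node you would read `/var/lib/minikube/certs/ca.crt`:

```shell
# Recompute a kubeadm discovery-token-ca-cert-hash from a CA certificate.
# The generated CA here is illustrative; substitute the cluster's real ca.crt.
cadir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -keyout "$cadir/ca.key" \
  -out "$cadir/ca.crt" -subj "/CN=minikubeCA" -days 1 2>/dev/null
hash=$(openssl x509 -pubkey -in "$cadir/ca.crt" |
  openssl pkey -pubin -outform der 2>/dev/null |
  openssl dgst -sha256 -hex | sed 's/^.* //')
echo "sha256:$hash"
```

Joining nodes use this pinned hash to authenticate the control plane before trusting anything served over the bootstrap token.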
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
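The run of identical `kubectl get sa default` commands above is a 500ms polling loop: minikube waits for the controller-manager to create the "default" ServiceAccount before applying the `minikube-rbac` binding. The retry shape can be sketched without a live cluster; `flaky` below is a hypothetical stand-in for the kubectl call that succeeds on its third attempt:

```shell
# Retry helper mirroring minikube's wait loop: run a command every 0.5s
# until it succeeds. "flaky" stands in for `kubectl get sa default`.
retry_until() {
  until "$@" >/dev/null 2>&1; do
    sleep 0.5
  done
}
attempts=0
flaky() { attempts=$((attempts + 1)); [ "$attempts" -ge 3 ]; }
retry_until flaky
```

In the log this loop ran for about 12.8s before the ServiceAccount appeared, which matches the "took 12.784009766s to wait for elevateKubeSystemPrivileges" metric that follows.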
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
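The `retry.go` lines above show minikube polling for the containerd socket after the restart: `stat` fails while the daemon is still coming up, and the check is retried until the 60s budget expires. A minimal sketch of that wait-for-socket pattern (function name, timeout, and interval are illustrative, not minikube's actual code):

```python
# Poll for a path until it appears or a deadline passes, as the log's
# retry of `stat /run/containerd/containerd.sock` does.
import os
import time

def wait_for_socket(path, timeout=60.0, interval=0.9):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(path):  # stand-in for running `stat` over SSH
            return True
        time.sleep(interval)      # back off before the next attempt
    return False
```

In the run above the second `stat` succeeds about a second later, so the wait ends well inside the 60s limit.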
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
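The kubelet unit shown above is templated from the node record `{m02 192.168.39.127 8443 v1.30.2 containerd true true}`: the binaries path comes from the Kubernetes version, and the hostname override and node IP come from the joining node. An illustrative reconstruction of that flag assembly (this helper is hypothetical, not minikube's actual `kubeadm.go` code):

```python
# Build the kubelet ExecStart flags from the node fields, mirroring the
# generated unit file in the log above.
def kubelet_flags(version, hostname, node_ip):
    return (
        f"/var/lib/minikube/binaries/{version}/kubelet"
        f" --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf"
        f" --config=/var/lib/kubelet/config.yaml"
        f" --hostname-override={hostname}"
        f" --kubeconfig=/etc/kubernetes/kubelet.conf"
        f" --node-ip={node_ip}"
    )

print(kubelet_flags("v1.30.2", "ha-333994-m02", "192.168.39.127"))
```

For this run the result matches the `ExecStart=` line in the unit file verbatim.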
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0x
c0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       11 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       11 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       11 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       11 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       12 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       12 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       12 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       12 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       12 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       12 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       12 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:38:40 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         11m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     12m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     12m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         12m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      12m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         12m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 12m                kube-proxy       
	  Normal  Starting                 12m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  12m (x4 over 12m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    12m (x4 over 12m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     12m (x3 over 12m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  12m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 12m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  12m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    12m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     12m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  12m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           12m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                11m                kubelet          Node ha-333994 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.574483Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"b38c55c42a3b698","initial-advertise-peer-urls":["https://192.168.39.180:2380"],"listen-peer-urls":["https://192.168.39.180:2380"],"advertise-client-urls":["https://192.168.39.180:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.39.180:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2024-07-17T17:26:10.574541Z","caller":"embed/etcd.go:857","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2024-07-17T17:26:10.574981Z","caller":"embed/etcd.go:597","msg":"serving peer traffic","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:26:10.5751Z","caller":"embed/etcd.go:569","msg":"cmux::serve","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:26:10.795898Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796088Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	
	
	==> kernel <==
	 17:38:41 up 13 min,  0 users,  load average: 0.27, 0.23, 0.13
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:36:36.593394       1 main.go:303] handling current node
	I0717 17:36:46.594438       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:36:46.594506       1 main.go:303] handling current node
	I0717 17:36:56.594272       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:36:56.594400       1 main.go:303] handling current node
	I0717 17:37:06.602697       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:06.602756       1 main.go:303] handling current node
	I0717 17:37:16.598595       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:16.598682       1 main.go:303] handling current node
	I0717 17:37:26.601183       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:26.601227       1 main.go:303] handling current node
	I0717 17:37:36.593264       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:36.593319       1 main.go:303] handling current node
	I0717 17:37:46.598363       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:46.598499       1 main.go:303] handling current node
	I0717 17:37:56.595211       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:37:56.595262       1 main.go:303] handling current node
	I0717 17:38:06.596913       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:38:06.597136       1 main.go:303] handling current node
	I0717 17:38:16.601764       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:38:16.601814       1 main.go:303] handling current node
	I0717 17:38:26.596776       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:38:26.596989       1 main.go:303] handling current node
	I0717 17:38:36.594167       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:38:36.594264       1 main.go:303] handling current node
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:29.229561       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:26:29.645377       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:26:29.645518       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0717 17:26:29.676538       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:26:30.131742       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="401.168376ms"
	I0717 17:26:30.147417       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="15.609225ms"
	I0717 17:26:30.150595       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="72.178µs"
	I0717 17:26:30.156045       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="88.456µs"
	I0717 17:26:46.686080       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="1.287244ms"
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:34:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:34:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:34:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:35:16 ha-333994 kubelet[1321]: E0717 17:35:16.468626    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:35:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:35:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:35:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:35:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:36:16 ha-333994 kubelet[1321]: E0717 17:36:16.469294    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:37:16 ha-333994 kubelet[1321]: E0717 17:37:16.469310    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:16 ha-333994 kubelet[1321]: E0717 17:38:16.469271    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:36 ha-333994 kubelet[1321]: E0717 17:38:36.696894    1321 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.122.156:43908->192.168.122.156:10010: write tcp 192.168.122.156:43908->192.168.122.156:10010: write: broken pipe
	Jul 17 17:38:37 ha-333994 kubelet[1321]: E0717 17:38:37.471144    1321 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.122.156:43918->192.168.122.156:10010: read tcp 192.168.122.156:43918->192.168.122.156:10010: read: connection reset by peer
	
	
	==> storage-provisioner [86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21] <==
	I0717 17:26:47.481175       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0717 17:26:47.495592       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0717 17:26:47.495817       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0717 17:26:47.507492       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0717 17:26:47.511210       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8!
	I0717 17:26:47.516960       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"9a33d6ef-207d-4ea5-bcad-ac569127b889", APIVersion:"v1", ResourceVersion:"447", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8 became leader
	I0717 17:26:47.611924       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_ha-333994_6bfaee24-69b3-4179-b0c0-9965e95a63d8!
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-74lsp busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/PingHostFromPods]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-74lsp busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-74lsp busybox-fc5497c4f-djvz6:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-74lsp
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-cz6xp (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-cz6xp:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                From               Message
	  ----     ------            ----               ----               -------
	  Warning  FailedScheduling  86s (x3 over 11m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	
	
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                From               Message
	  ----     ------            ----               ----               -------
	  Warning  FailedScheduling  86s (x3 over 11m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/PingHostFromPods FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/PingHostFromPods (2.56s)

                                                
                                    
TestMultiControlPlane/serial/AddWorkerNode (117.02s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-linux-amd64 node add -p ha-333994 -v=7 --alsologtostderr
E0717 17:39:15.186068   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:39:41.796815   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
ha_test.go:228: (dbg) Done: out/minikube-linux-amd64 node add -p ha-333994 -v=7 --alsologtostderr: (1m54.572847208s)
ha_test.go:234: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:234: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (571.157248ms)

                                                
                                                
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:40:36.869210   36203 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:40:36.869319   36203 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:40:36.869326   36203 out.go:304] Setting ErrFile to fd 2...
	I0717 17:40:36.869331   36203 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:40:36.869521   36203 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:40:36.869671   36203 out.go:298] Setting JSON to false
	I0717 17:40:36.869699   36203 mustload.go:65] Loading cluster: ha-333994
	I0717 17:40:36.869752   36203 notify.go:220] Checking for updates...
	I0717 17:40:36.870033   36203 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:40:36.870046   36203 status.go:255] checking status of ha-333994 ...
	I0717 17:40:36.870440   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:36.870481   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:36.889233   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42521
	I0717 17:40:36.889648   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:36.890243   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:36.890268   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:36.890658   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:36.890865   36203 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:40:36.892314   36203 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:40:36.892331   36203 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:40:36.892620   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:36.892653   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:36.907136   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32819
	I0717 17:40:36.907567   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:36.908004   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:36.908025   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:36.908350   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:36.908511   36203 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:40:36.911227   36203 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:36.911615   36203 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:40:36.911634   36203 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:36.911781   36203 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:40:36.912066   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:36.912108   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:36.926721   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35331
	I0717 17:40:36.927119   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:36.927656   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:36.927680   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:36.927958   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:36.928137   36203 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:40:36.928324   36203 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:40:36.928344   36203 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:40:36.931132   36203 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:36.931521   36203 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:40:36.931543   36203 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:36.931670   36203 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:40:36.931846   36203 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:40:36.931997   36203 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:40:36.932172   36203 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:40:37.018100   36203 ssh_runner.go:195] Run: systemctl --version
	I0717 17:40:37.024376   36203 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:40:37.040245   36203 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:40:37.040271   36203 api_server.go:166] Checking apiserver status ...
	I0717 17:40:37.040300   36203 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:40:37.055099   36203 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:40:37.065282   36203 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:40:37.065326   36203 ssh_runner.go:195] Run: ls
	I0717 17:40:37.070620   36203 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:40:37.074619   36203 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:40:37.074638   36203 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:40:37.074648   36203 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:40:37.074662   36203 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:40:37.074953   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:37.074981   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:37.089550   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40723
	I0717 17:40:37.089956   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:37.090410   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:37.090427   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:37.090748   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:37.090935   36203 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:40:37.092485   36203 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:40:37.092501   36203 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:40:37.092761   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:37.092790   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:37.107909   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46883
	I0717 17:40:37.108264   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:37.108635   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:37.108654   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:37.108958   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:37.109143   36203 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:40:37.111856   36203 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:37.112190   36203 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:40:37.112209   36203 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:37.112395   36203 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:40:37.112769   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:37.112815   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:37.127431   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33801
	I0717 17:40:37.127808   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:37.128255   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:37.128272   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:37.128571   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:37.128739   36203 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:40:37.128909   36203 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:40:37.128928   36203 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:40:37.131443   36203 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:37.131830   36203 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:40:37.131859   36203 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:37.131916   36203 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:40:37.132080   36203 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:40:37.132195   36203 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:40:37.132300   36203 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:40:37.218242   36203 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:40:37.235390   36203 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:40:37.235414   36203 api_server.go:166] Checking apiserver status ...
	I0717 17:40:37.235443   36203 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:40:37.249230   36203 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:40:37.249249   36203 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:40:37.249257   36203 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:40:37.249274   36203 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:40:37.249613   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:37.249649   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:37.264847   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38903
	I0717 17:40:37.265264   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:37.265717   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:37.265737   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:37.266044   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:37.266236   36203 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:40:37.267861   36203 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:40:37.267877   36203 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:40:37.268156   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:37.268186   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:37.283447   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35705
	I0717 17:40:37.283876   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:37.284320   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:37.284340   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:37.284609   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:37.284793   36203 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:40:37.287375   36203 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:37.287784   36203 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:40:37.287809   36203 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:37.287925   36203 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:40:37.288205   36203 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:37.288235   36203 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:37.302213   36203 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42399
	I0717 17:40:37.302677   36203 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:37.303131   36203 main.go:141] libmachine: Using API Version  1
	I0717 17:40:37.303155   36203 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:37.303439   36203 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:37.303587   36203 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:40:37.303783   36203 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:40:37.303810   36203 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:40:37.306577   36203 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:37.306917   36203 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:40:37.306960   36203 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:37.307053   36203 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:40:37.307206   36203 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:40:37.307334   36203 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:40:37.307478   36203 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:40:37.385506   36203 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:40:37.400178   36203 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
ha_test.go:236: failed to run minikube status. args "out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/AddWorkerNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/AddWorkerNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.171194071s)
helpers_test.go:252: TestMultiControlPlane/serial/AddWorkerNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
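The domain definition above can also be modeled as Go structs and marshaled with `encoding/xml` rather than templated. A pared-down sketch of just the skeleton (name, memory, vcpu) — the struct fields and element subset are illustrative, libvirt's real schema has many more elements, and `encoding/xml` emits double-quoted attributes where the log shows single quotes:

```go
package main

import (
	"encoding/xml"
	"fmt"
	"strconv"
)

// memory models <memory unit='MiB'>2200</memory>.
type memory struct {
	Unit  string `xml:"unit,attr"`
	Value string `xml:",chardata"`
}

// domain models the outer <domain type='kvm'> element, skeleton only.
type domain struct {
	XMLName xml.Name `xml:"domain"`
	Type    string   `xml:"type,attr"`
	Name    string   `xml:"name"`
	Memory  memory   `xml:"memory"`
	VCPU    int      `xml:"vcpu"`
}

// domainXML marshals the skeleton of a KVM domain definition.
func domainXML(name string, memMiB, vcpus int) (string, error) {
	d := domain{
		Type:   "kvm",
		Name:   name,
		Memory: memory{Unit: "MiB", Value: strconv.Itoa(memMiB)},
		VCPU:   vcpus,
	}
	out, err := xml.MarshalIndent(d, "", "  ")
	return string(out), err
}

func main() {
	s, err := domainXML("ha-333994", 2200, 2)
	if err != nil {
		panic(err)
	}
	fmt.Println(s)
}
```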
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] M
ountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
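The kube-vip manifest above pins the control-plane VIP in the `address` env var. A quick way to pull that value back out of a generated manifest for inspection is a small awk helper; the function name and the default manifest path are illustrative, though minikube does write the file to `/etc/kubernetes/manifests/kube-vip.yaml` per the scp line below.

```shell
# extract_vip MANIFEST: print the value of the "address" env var from a
# kube-vip static pod manifest (the line immediately after "name: address").
extract_vip() {
  awk '/name: address/ {getline; sub(/.*value: */, ""); print}' "$1"
}
```

Useful for sanity-checking that the advertised VIP matches the `APIServerHAVIP` recorded in the cluster config (192.168.39.254 here).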
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
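The /etc/hosts command above uses a filter-then-append pattern so repeated runs never duplicate the `control-plane.minikube.internal` entry. The same idea as a standalone, hypothetical helper (operating on an arbitrary file, so no sudo, and with `rm` cleanup added):

```shell
# add_host HOSTSFILE IP NAME: ensure exactly one "IP<TAB>NAME" line in HOSTSFILE.
# Mirrors the { grep -v ...; echo ...; } > tmp; cp tmp pattern from the log.
add_host() {
  hosts="$1"; ip="$2"; name="$3"
  tmp="/tmp/h.$$"
  # Drop any existing line ending in "<TAB>NAME", then append the fresh entry.
  { grep -v "$(printf '\t')${name}\$" "$hosts" 2>/dev/null; printf '%s\t%s\n' "$ip" "$name"; } > "$tmp"
  cp "$tmp" "$hosts"
  rm -f "$tmp"
}
```

Running it twice with the same arguments leaves a single entry, which is why the log step is safe on restarts.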
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
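The openssl/ln sequence above builds the `<subject-hash>.0` symlinks (e.g. `b5213941.0` for minikubeCA) that OpenSSL's `-CApath` directory lookup expects. A self-contained sketch of that naming scheme, with an illustrative function name:

```shell
# link_ca CERT DIR: symlink CERT into DIR under its OpenSSL subject hash,
# producing the same "<hash>.0" name as the log's ln -fs commands.
link_ca() {
  cert="$1"; dir="$2"
  hash=$(openssl x509 -hash -noout -in "$cert")
  ln -fs "$cert" "$dir/$hash.0"
  echo "$dir/$hash.0"
}
```

The `.0` suffix disambiguates distinct certificates whose subjects hash to the same value (`.1`, `.2`, ... would follow); the log's `test -L || ln -fs` guard simply avoids re-linking on every start.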
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Clust
erName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] Moun
tPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
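The shell snippet above is idempotent: skip if `/etc/hosts` already maps the hostname, otherwise rewrite the `127.0.1.1` line, or append one if none exists. The same logic as a pure Go function — a sketch of the behavior, not the code minikube actually ships (it runs the shell over SSH):

```go
package main

import (
	"fmt"
	"strings"
)

// ensureHostsEntry mirrors the shell above in pure Go: if no line already
// ends with hostname, replace the 127.0.1.1 line, or append one.
func ensureHostsEntry(hosts, hostname string) string {
	lines := strings.Split(hosts, "\n")
	for _, l := range lines {
		fields := strings.Fields(l)
		if len(fields) >= 2 && fields[len(fields)-1] == hostname {
			return hosts // already present, nothing to do
		}
	}
	for i, l := range lines {
		if strings.HasPrefix(l, "127.0.1.1") {
			lines[i] = "127.0.1.1 " + hostname
			return strings.Join(lines, "\n")
		}
	}
	return hosts + "\n127.0.1.1 " + hostname
}

func main() {
	fmt.Println(ensureHostsEntry("127.0.0.1 localhost\n127.0.1.1 old", "ha-333994-m02"))
}
```

Mapping the hostname to `127.0.1.1` (not `127.0.0.1`) follows the Debian convention for hosts without a permanent resolvable address; it keeps `sudo` and kubelet hostname lookups fast inside the guest.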
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
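	
	The failure above is a checksum-verified kubelet download aborted by a connection reset from dl.k8s.io. The `checksum=file:` behavior of the getter can be mimicked locally with plain shell to sanity-check a retried download; this is a hedged sketch using stand-in local files (no network fetch, hypothetical file names):
	
	```shell
	#!/bin/sh
	# Sketch of the verification step the download performs: obtain the
	# artifact and its .sha256 companion, then compare digests before use.
	# Stand-in files replace the real fetch of the kubelet binary.
	set -eu
	
	bin=kubelet.download                                   # hypothetical downloaded artifact
	printf 'stand-in binary contents' > "$bin"
	sha256sum "$bin" | awk '{print $1}' > "$bin.sha256"    # stands in for the published .sha256
	
	got=$(sha256sum "$bin" | awk '{print $1}')
	want=$(cat "$bin.sha256")
	
	if [ "$got" = "$want" ]; then
	    echo "checksum OK"
	else
	    echo "checksum MISMATCH" >&2
	    exit 1
	fi
	```
	
	On a real retry, one would fetch both https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet and its .sha256 file and run the same comparison before placing the binary in the cache directory.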
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       13 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       13 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       13 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       13 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       14 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       14 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       14 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       14 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       14 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       14 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       14 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:33 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         14m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      14m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 14m                kube-proxy       
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m (x3 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           14m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                13m                kubelet          Node ha-333994 status is now: NodeReady
	
	
	Name:               ha-333994-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_17T17_40_16_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:40:15 +0000
	Taints:             node.kubernetes.io/not-ready:NoExecute
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:35 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:34 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.197
	  Hostname:    ha-333994-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 91a213a8eb09434f90fc54c32c57b24f
	  System UUID:                91a213a8-eb09-434f-90fc-54c32c57b24f
	  Boot ID:                    45ccee74-7f48-47d9-9195-b6f993074cc5
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-74lsp    0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kindnet-24fc8              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      23s
	  kube-system                 kube-proxy-xkkdj           0 (0%)        0 (0%)      0 (0%)           0 (0%)         23s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 18s                kube-proxy       
	  Normal  NodeHasSufficientMemory  23s (x2 over 23s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    23s (x2 over 23s)  kubelet          Node ha-333994-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     23s (x2 over 23s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  23s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           19s                node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	  Normal  NodeReady                4s                 kubelet          Node ha-333994-m03 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.795898Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796088Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	
	
	==> kernel <==
	 17:40:38 up 14 min,  0 users,  load average: 0.35, 0.27, 0.16
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:39:16.601146       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:16.601222       1 main.go:303] handling current node
	I0717 17:39:26.600801       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:26.600899       1 main.go:303] handling current node
	I0717 17:39:36.593222       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:36.593331       1 main.go:303] handling current node
	I0717 17:39:46.601179       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:46.601359       1 main.go:303] handling current node
	I0717 17:39:56.594724       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:56.594776       1 main.go:303] handling current node
	I0717 17:40:06.602658       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:06.602795       1 main.go:303] handling current node
	I0717 17:40:16.593559       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:16.593631       1 main.go:303] handling current node
	I0717 17:40:16.593651       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:16.593660       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:16.594519       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.1.0/24 Src: <nil> Gw: 192.168.39.197 Flags: [] Table: 0} 
	I0717 17:40:26.593205       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:26.593326       1 main.go:303] handling current node
	I0717 17:40:26.593353       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:26.593491       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:36.593114       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:36.593470       1 main.go:303] handling current node
	I0717 17:40:36.593560       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:36.593643       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:37:16 ha-333994 kubelet[1321]: E0717 17:37:16.469310    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:16 ha-333994 kubelet[1321]: E0717 17:38:16.469271    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:36 ha-333994 kubelet[1321]: E0717 17:38:36.696894    1321 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.122.156:43908->192.168.122.156:10010: write tcp 192.168.122.156:43908->192.168.122.156:10010: write: broken pipe
	Jul 17 17:38:37 ha-333994 kubelet[1321]: E0717 17:38:37.471144    1321 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.122.156:43918->192.168.122.156:10010: read tcp 192.168.122.156:43918->192.168.122.156:10010: read: connection reset by peer
	Jul 17 17:39:16 ha-333994 kubelet[1321]: E0717 17:39:16.468909    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:40:16 ha-333994 kubelet[1321]: E0717 17:40:16.471379    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/AddWorkerNode]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  3m23s (x3 over 13m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  5s                   default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/AddWorkerNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/AddWorkerNode (117.02s)

                                                
                                    
TestMultiControlPlane/serial/HAppyAfterClusterStart (2.22s)
=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
ha_test.go:304: expected profile "ha-333994" in json of 'profile list' to include 4 nodes but have 3 nodes. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-333994\",\"Status\":\"Stopped\",\"Config\":{\"Name\":\"ha-333994\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"kvm2\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACount\":1,\"APIServerPor
t\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.2\",\"ClusterName\":\"ha-333994\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.168.39.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"containerd\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.168.39.180\",\"Port\":8443,\"KubernetesVers
ion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.168.39.127\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.168.39.197\",\"Port\":0,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"\",\"ControlPlane\":false,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":false,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-device-plugin\":
false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/home/jenkins:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOptimizations\"
:false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-linux-amd64 profile list --output json"
ha_test.go:307: expected profile "ha-333994" in json of 'profile list' to have "HAppy" status but have "Stopped" status. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-333994\",\"Status\":\"Stopped\",\"Config\":{\"Name\":\"ha-333994\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"kvm2\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACount\":1,\
"APIServerPort\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.2\",\"ClusterName\":\"ha-333994\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.168.39.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"containerd\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.168.39.180\",\"Port\":8443,\"K
ubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.168.39.127\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.168.39.197\",\"Port\":0,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"\",\"ControlPlane\":false,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":false,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-dev
ice-plugin\":false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/home/jenkins:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOp
timizations\":false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-linux-amd64 profile list --output json"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/HAppyAfterClusterStart FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/HAppyAfterClusterStart]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.173565233s)
helpers_test.go:252: TestMultiControlPlane/serial/HAppyAfterClusterStart logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Clust
erName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] Moun
tPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
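The `ln -fs` commands above install each CA certificate under its OpenSSL subject-hash name (e.g. `b5213941.0` for the minikube CA, produced by the preceding `openssl x509 -hash -noout` calls), which is how OpenSSL-based clients locate trust anchors in `/etc/ssl/certs`. A minimal Go sketch of how such a command could be assembled; the helper name is hypothetical, not minikube's actual `certs.go` code:

```go
package main

import "fmt"

// buildTrustLinkCmd is a hypothetical helper reproducing the shell
// commands seen in the log: it links a certificate into /etc/ssl/certs
// under its OpenSSL subject-hash name (<hash>.0) so OpenSSL-based
// clients can find it, skipping the link if it already exists.
func buildTrustLinkCmd(certPath, subjectHash string) string {
	link := fmt.Sprintf("/etc/ssl/certs/%s.0", subjectHash)
	return fmt.Sprintf("sudo /bin/bash -c \"test -L %s || ln -fs %s %s\"", link, certPath, link)
}

func main() {
	// Values taken from the log lines above (minikubeCA -> b5213941.0).
	fmt.Println(buildTrustLinkCmd("/etc/ssl/certs/minikubeCA.pem", "b5213941"))
}
```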
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
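The kubelet systemd drop-in shown above is rendered per node, substituting the node's hostname override and IP into the `ExecStart` line. A sketch of that substitution using Go's `text/template`; the template here is an illustrative reduction, not minikube's actual `kubeadm.go` template:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// node holds the per-node values substituted into the kubelet
// ExecStart line seen in the log (hostname override and node IP).
type node struct {
	Name string
	IP   string
}

// kubeletTmpl is an illustrative reduction of the systemd drop-in above.
const kubeletTmpl = `[Service]
ExecStart=
ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --hostname-override={{.Name}} --node-ip={{.IP}}
`

// renderKubeletUnit executes the template for one node.
func renderKubeletUnit(n node) (string, error) {
	t, err := template.New("kubelet").Parse(kubeletTmpl)
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := t.Execute(&buf, n); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	// Values for the m02 node from the log above.
	out, err := renderKubeletUnit(node{Name: "ha-333994-m02", IP: "192.168.39.127"})
	if err != nil {
		panic(err)
	}
	fmt.Print(out)
}
```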
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
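The failed download above was requested with a `?checksum=file:…kubelet.sha256` query, meaning the getter fetches a published SHA-256 digest alongside the binary and verifies the payload against it; here the TCP connection reset from 151.101.193.55 aborted the transfer before that check could pass. A minimal sketch of such a post-download verification step; the helper name is hypothetical, not the actual go-getter implementation:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// verifySHA256 is a hypothetical stand-in for the checksum step the
// getter performs after downloading a kubelet/kubeadm binary: hash the
// payload and compare it against the published hex digest.
func verifySHA256(payload []byte, wantHex string) error {
	sum := sha256.Sum256(payload)
	got := hex.EncodeToString(sum[:])
	if got != wantHex {
		return fmt.Errorf("checksum mismatch: got %s, want %s", got, wantHex)
	}
	return nil
}

func main() {
	// sha256("hello") -- a well-known digest used here only to exercise the check.
	want := "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
	fmt.Println(verifySHA256([]byte("hello"), want))
}
```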
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       13 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       13 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       13 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       13 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       14 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       14 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       14 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       14 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       14 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       14 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       14 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:33 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         14m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      14m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 14m                kube-proxy       
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m (x3 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           14m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                13m                kubelet          Node ha-333994 status is now: NodeReady
	
	
	Name:               ha-333994-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_17T17_40_16_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:40:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:35 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:34 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.197
	  Hostname:    ha-333994-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 91a213a8eb09434f90fc54c32c57b24f
	  System UUID:                91a213a8-eb09-434f-90fc-54c32c57b24f
	  Boot ID:                    45ccee74-7f48-47d9-9195-b6f993074cc5
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-74lsp    0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kindnet-24fc8              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      25s
	  kube-system                 kube-proxy-xkkdj           0 (0%)        0 (0%)      0 (0%)           0 (0%)         25s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 21s                kube-proxy       
	  Normal  NodeHasSufficientMemory  25s (x2 over 25s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    25s (x2 over 25s)  kubelet          Node ha-333994-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     25s (x2 over 25s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  25s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           21s                node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	  Normal  NodeReady                6s                 kubelet          Node ha-333994-m03 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.795898Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796088Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	
	
	==> kernel <==
	 17:40:40 up 14 min,  0 users,  load average: 0.35, 0.27, 0.16
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:39:16.601146       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:16.601222       1 main.go:303] handling current node
	I0717 17:39:26.600801       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:26.600899       1 main.go:303] handling current node
	I0717 17:39:36.593222       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:36.593331       1 main.go:303] handling current node
	I0717 17:39:46.601179       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:46.601359       1 main.go:303] handling current node
	I0717 17:39:56.594724       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:56.594776       1 main.go:303] handling current node
	I0717 17:40:06.602658       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:06.602795       1 main.go:303] handling current node
	I0717 17:40:16.593559       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:16.593631       1 main.go:303] handling current node
	I0717 17:40:16.593651       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:16.593660       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:16.594519       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.1.0/24 Src: <nil> Gw: 192.168.39.197 Flags: [] Table: 0} 
	I0717 17:40:26.593205       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:26.593326       1 main.go:303] handling current node
	I0717 17:40:26.593353       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:26.593491       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:36.593114       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:36.593470       1 main.go:303] handling current node
	I0717 17:40:36.593560       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:36.593643       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:37:16 ha-333994 kubelet[1321]: E0717 17:37:16.469310    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:16 ha-333994 kubelet[1321]: E0717 17:38:16.469271    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:36 ha-333994 kubelet[1321]: E0717 17:38:36.696894    1321 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.122.156:43908->192.168.122.156:10010: write tcp 192.168.122.156:43908->192.168.122.156:10010: write: broken pipe
	Jul 17 17:38:37 ha-333994 kubelet[1321]: E0717 17:38:37.471144    1321 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.122.156:43918->192.168.122.156:10010: read tcp 192.168.122.156:43918->192.168.122.156:10010: read: connection reset by peer
	Jul 17 17:39:16 ha-333994 kubelet[1321]: E0717 17:39:16.468909    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:40:16 ha-333994 kubelet[1321]: E0717 17:40:16.471379    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/HAppyAfterClusterStart]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6:

-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  3m25s (x3 over 13m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  7s                   default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/HAppyAfterClusterStart FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/HAppyAfterClusterStart (2.22s)

TestMultiControlPlane/serial/CopyFile (2.42s)

=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:326: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status --output json -v=7 --alsologtostderr
ha_test.go:326: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status --output json -v=7 --alsologtostderr: exit status 2 (569.273446ms)

-- stdout --
	[{"Name":"ha-333994","Host":"Running","Kubelet":"Running","APIServer":"Running","Kubeconfig":"Configured","Worker":false},{"Name":"ha-333994-m02","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false},{"Name":"ha-333994-m03","Host":"Running","Kubelet":"Running","APIServer":"Irrelevant","Kubeconfig":"Irrelevant","Worker":true}]

-- /stdout --
** stderr ** 
	I0717 17:40:41.602360   36550 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:40:41.602487   36550 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:40:41.602497   36550 out.go:304] Setting ErrFile to fd 2...
	I0717 17:40:41.602503   36550 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:40:41.602665   36550 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:40:41.602835   36550 out.go:298] Setting JSON to true
	I0717 17:40:41.602863   36550 mustload.go:65] Loading cluster: ha-333994
	I0717 17:40:41.602989   36550 notify.go:220] Checking for updates...
	I0717 17:40:41.603231   36550 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:40:41.603250   36550 status.go:255] checking status of ha-333994 ...
	I0717 17:40:41.603626   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:41.603686   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:41.621673   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34549
	I0717 17:40:41.622099   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:41.622763   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:41.622783   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:41.623146   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:41.623331   36550 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:40:41.624895   36550 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:40:41.624912   36550 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:40:41.625305   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:41.625346   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:41.640102   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33027
	I0717 17:40:41.640573   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:41.641049   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:41.641071   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:41.641394   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:41.641562   36550 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:40:41.644443   36550 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:41.644909   36550 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:40:41.644944   36550 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:41.645083   36550 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:40:41.645369   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:41.645409   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:41.660276   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41911
	I0717 17:40:41.660669   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:41.661106   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:41.661127   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:41.661393   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:41.661571   36550 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:40:41.661760   36550 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:40:41.661795   36550 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:40:41.664521   36550 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:41.664883   36550 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:40:41.664907   36550 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:41.665071   36550 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:40:41.665239   36550 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:40:41.665396   36550 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:40:41.665533   36550 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:40:41.749967   36550 ssh_runner.go:195] Run: systemctl --version
	I0717 17:40:41.755863   36550 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:40:41.773041   36550 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:40:41.773064   36550 api_server.go:166] Checking apiserver status ...
	I0717 17:40:41.773096   36550 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:40:41.787172   36550 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:40:41.797910   36550 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:40:41.797965   36550 ssh_runner.go:195] Run: ls
	I0717 17:40:41.802499   36550 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:40:41.806482   36550 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:40:41.806503   36550 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:40:41.806514   36550 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:40:41.806535   36550 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:40:41.806813   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:41.806852   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:41.821746   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36397
	I0717 17:40:41.822100   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:41.822558   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:41.822579   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:41.822865   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:41.823027   36550 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:40:41.824305   36550 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:40:41.824319   36550 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:40:41.824620   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:41.824649   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:41.838918   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34791
	I0717 17:40:41.839312   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:41.839819   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:41.839847   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:41.840147   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:41.840324   36550 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:40:41.843287   36550 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:41.843697   36550 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:40:41.843728   36550 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:41.843853   36550 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:40:41.844173   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:41.844213   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:41.858422   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42843
	I0717 17:40:41.858795   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:41.859186   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:41.859207   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:41.859499   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:41.859654   36550 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:40:41.859826   36550 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:40:41.859853   36550 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:40:41.862155   36550 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:41.862495   36550 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:40:41.862531   36550 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:41.862622   36550 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:40:41.862796   36550 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:40:41.862939   36550 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:40:41.863061   36550 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:40:41.949846   36550 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:40:41.965096   36550 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:40:41.965128   36550 api_server.go:166] Checking apiserver status ...
	I0717 17:40:41.965167   36550 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:40:41.978234   36550 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:40:41.978255   36550 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:40:41.978267   36550 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:40:41.978288   36550 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:40:41.978587   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:41.978629   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:41.994680   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35025
	I0717 17:40:41.995112   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:41.995557   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:41.995576   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:41.995828   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:41.996014   36550 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:40:41.997420   36550 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:40:41.997437   36550 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:40:41.997707   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:41.997743   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:42.012924   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40439
	I0717 17:40:42.013291   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:42.013718   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:42.013756   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:42.014051   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:42.014237   36550 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:40:42.016644   36550 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:42.017040   36550 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:40:42.017064   36550 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:42.017174   36550 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:40:42.017442   36550 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:42.017472   36550 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:42.031607   36550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36431
	I0717 17:40:42.031950   36550 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:42.032349   36550 main.go:141] libmachine: Using API Version  1
	I0717 17:40:42.032369   36550 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:42.032631   36550 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:42.032804   36550 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:40:42.032970   36550 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:40:42.032990   36550 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:40:42.035626   36550 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:42.036044   36550 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:40:42.036074   36550 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:42.036223   36550 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:40:42.036363   36550 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:40:42.036504   36550 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:40:42.036611   36550 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:40:42.113814   36550 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:40:42.128101   36550 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:328: failed to run minikube status. args "out/minikube-linux-amd64 -p ha-333994 status --output json -v=7 --alsologtostderr" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/CopyFile FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/CopyFile]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.145167437s)
helpers_test.go:252: TestMultiControlPlane/serial/CopyFile logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] M
ountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
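The "Will wait 60s for socket path" step above retries `stat` until `/run/containerd/containerd.sock` exists. A minimal sketch of that poll loop (function name and fixed 1s sleep are assumptions — minikube's `retry.go` uses a growing backoff, e.g. the 933ms retry logged above):

```shell
# Hypothetical sketch: poll stat until the path appears or the deadline
# (default 60s) passes; returns non-zero on timeout.
wait_for_socket() {
  path=$1
  deadline=$(( $(date +%s) + ${2:-60} ))
  until stat "$path" >/dev/null 2>&1; do
    [ "$(date +%s)" -ge "$deadline" ] && return 1
    sleep 1
  done
  return 0
}
```

In the log the socket shows up on the second `stat`, about a second after `systemctl restart containerd`.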
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
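The `/etc/hosts` one-liner above drops any stale `host.minikube.internal` entry, appends the current gateway mapping, and swaps the file into place via a temp copy. The same pattern against a scratch file instead of the real `/etc/hosts` (bash syntax, as in the logged command; the pre-existing stale IP here is invented):

```shell
# Rewrite a hosts file: remove old host.minikube.internal line, append new one.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.168.39.9\thost.minikube.internal\n' > "$hosts"
{ grep -v $'\thost.minikube.internal$' "$hosts"
  printf '192.168.39.1\thost.minikube.internal\n'
} > "$hosts.new" && mv "$hosts.new" "$hosts"
```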
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
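The `openssl x509 -hash` / `ln -fs .../<hash>.0` pairs above follow OpenSSL's subject-hash lookup convention: a CA cert in `/etc/ssl/certs` is found via a symlink named after its subject hash (e.g. `b5213941.0` for minikubeCA). A self-contained demonstration with a throwaway cert (names and paths are illustrative only):

```shell
# Generate a throwaway self-signed CA, then link it by subject hash,
# mirroring the ln -fs /etc/ssl/certs/<hash>.0 commands in the log.
tmp=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demoCA" \
  -keyout "$tmp/ca.key" -out "$tmp/ca.pem" 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$tmp/ca.pem")   # 8 hex chars
ln -fs "$tmp/ca.pem" "$tmp/$hash.0"
```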
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
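The static-pod manifest logged above is what drives the control-plane VIP (192.168.39.254 on port 8443, with load balancing auto-enabled). As a quick sanity check, the env settings kube-vip will see can be pulled out of that manifest text with a stdlib-only sketch; the excerpt below is copied from the log, and the parser deliberately tolerates the stray `name :` spelling that appears in the `lb_enable` entry:

```python
# Stdlib-only sketch: extract the env name/value pairs from the kube-vip
# manifest above. The excerpt is copied from the log; the parser tolerates
# the stray "name :" spelling in one entry.
MANIFEST_ENV = """
    - name: vip_arp
      value: "true"
    - name: port
      value: "8443"
    - name: cp_enable
      value: "true"
    - name: address
      value: 192.168.39.254
    - name : lb_enable
      value: "true"
    - name: lb_port
      value: "8443"
"""

def parse_env(text):
    """Collect name/value pairs from a kube-vip env block."""
    env, name = {}, None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("- name"):
            name = line.split(":", 1)[1].strip()
        elif line.startswith("value:") and name is not None:
            env[name] = line.split(":", 1)[1].strip().strip('"')
            name = None
    return env

env = parse_env(MANIFEST_ENV)
print(env["address"])    # the HA virtual IP from the manifest
print(env["lb_enable"])  # control-plane load balancing auto-enabled
```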
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
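The fatal error above is a TCP connection reset from dl.k8s.io while fetching the kubelet binary, not a checksum mismatch. The `?checksum=file:<url>.sha256` query tells the getter (hashicorp/go-getter inside minikube) to verify the completed download against the published digest file. A rough sketch of that verification step, with hypothetical file contents standing in for the real kubelet bytes:

```python
# Sketch of the post-download verification implied by the
# "checksum=file:...sha256" query string. Inputs are hypothetical;
# the real logic lives in hashicorp/go-getter.
import hashlib

def sha256_matches(payload: bytes, digest_line: str) -> bool:
    """Compare the SHA-256 of downloaded bytes to a published digest.

    A .sha256 file typically holds "<hex digest>  <filename>"; only the
    first whitespace-separated field is the digest.
    """
    expected = digest_line.strip().split()[0]
    return hashlib.sha256(payload).hexdigest() == expected

# Simulated .sha256 contents for a payload we control.
digest_line = hashlib.sha256(b"kubelet-bytes").hexdigest() + "  kubelet"
print(sha256_matches(b"kubelet-bytes", digest_line))  # True
print(sha256_matches(b"corrupted", digest_line))      # False
```

When the TCP stream is reset mid-transfer, the getter fails before this check ever runs, which is why the error reports `read: connection reset by peer` rather than a digest mismatch.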
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       13 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       13 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       13 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       13 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       14 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       14 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       14 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       14 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       14 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       14 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       14 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:33 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         14m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      14m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 14m                kube-proxy       
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m (x3 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           14m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                13m                kubelet          Node ha-333994 status is now: NodeReady
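The percentages in the Allocated resources table above are requests and limits relative to the node's allocatable capacity (cpu: 2, memory: 2164184Ki), rounded down. For example, 950m of CPU requests against 2000m allocatable yields the 47% shown; a quick arithmetic check:

```python
# Reproduce the percentages kubectl prints for node ha-333994:
# requests relative to allocatable capacity, integer-truncated.
cpu_requests_m = 950            # summed CPU requests (millicores)
cpu_allocatable_m = 2 * 1000    # 2 CPUs
print(cpu_requests_m * 100 // cpu_allocatable_m)    # 47

mem_requests_mi = 290           # summed memory requests (Mi)
mem_allocatable_mi = 2164184 // 1024                # Ki -> Mi
print(mem_requests_mi * 100 // mem_allocatable_mi)  # 13
```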
	
	
	Name:               ha-333994-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_17T17_40_16_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:40:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:35 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:40:34 +0000   Wed, 17 Jul 2024 17:40:34 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.197
	  Hostname:    ha-333994-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 91a213a8eb09434f90fc54c32c57b24f
	  System UUID:                91a213a8-eb09-434f-90fc-54c32c57b24f
	  Boot ID:                    45ccee74-7f48-47d9-9195-b6f993074cc5
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-74lsp    0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kindnet-24fc8              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      28s
	  kube-system                 kube-proxy-xkkdj           0 (0%)        0 (0%)      0 (0%)           0 (0%)         28s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)   100m (5%)
	  memory             50Mi (2%)   50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 23s                kube-proxy       
	  Normal  NodeHasSufficientMemory  28s (x2 over 28s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    28s (x2 over 28s)  kubelet          Node ha-333994-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     28s (x2 over 28s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  28s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           24s                node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	  Normal  NodeReady                9s                 kubelet          Node ha-333994-m03 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.795898Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796088Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	
	
	==> kernel <==
	 17:40:43 up 15 min,  0 users,  load average: 0.32, 0.26, 0.16
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:39:16.601146       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:16.601222       1 main.go:303] handling current node
	I0717 17:39:26.600801       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:26.600899       1 main.go:303] handling current node
	I0717 17:39:36.593222       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:36.593331       1 main.go:303] handling current node
	I0717 17:39:46.601179       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:46.601359       1 main.go:303] handling current node
	I0717 17:39:56.594724       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:56.594776       1 main.go:303] handling current node
	I0717 17:40:06.602658       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:06.602795       1 main.go:303] handling current node
	I0717 17:40:16.593559       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:16.593631       1 main.go:303] handling current node
	I0717 17:40:16.593651       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:16.593660       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:16.594519       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.1.0/24 Src: <nil> Gw: 192.168.39.197 Flags: [] Table: 0} 
	I0717 17:40:26.593205       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:26.593326       1 main.go:303] handling current node
	I0717 17:40:26.593353       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:26.593491       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:36.593114       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:36.593470       1 main.go:303] handling current node
	I0717 17:40:36.593560       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:36.593643       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:37:16 ha-333994 kubelet[1321]: E0717 17:37:16.469310    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:16 ha-333994 kubelet[1321]: E0717 17:38:16.469271    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:36 ha-333994 kubelet[1321]: E0717 17:38:36.696894    1321 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.122.156:43908->192.168.122.156:10010: write tcp 192.168.122.156:43908->192.168.122.156:10010: write: broken pipe
	Jul 17 17:38:37 ha-333994 kubelet[1321]: E0717 17:38:37.471144    1321 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.122.156:43918->192.168.122.156:10010: read tcp 192.168.122.156:43918->192.168.122.156:10010: read: connection reset by peer
	Jul 17 17:39:16 ha-333994 kubelet[1321]: E0717 17:39:16.468909    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:40:16 ha-333994 kubelet[1321]: E0717 17:40:16.471379    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/CopyFile]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  3m27s (x3 over 13m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  0s (x2 over 9s)      default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/CopyFile FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/CopyFile (2.42s)

                                                
                                    
TestMultiControlPlane/serial/StopSecondaryNode (3.61s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:363: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 node stop m02 -v=7 --alsologtostderr
ha_test.go:363: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 node stop m02 -v=7 --alsologtostderr: (1.270957229s)
ha_test.go:369: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:369: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 7 (408.864241ms)

                                                
                                                
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:40:45.285865   36780 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:40:45.285957   36780 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:40:45.285964   36780 out.go:304] Setting ErrFile to fd 2...
	I0717 17:40:45.285969   36780 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:40:45.286176   36780 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:40:45.286316   36780 out.go:298] Setting JSON to false
	I0717 17:40:45.286340   36780 mustload.go:65] Loading cluster: ha-333994
	I0717 17:40:45.286448   36780 notify.go:220] Checking for updates...
	I0717 17:40:45.286682   36780 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:40:45.286695   36780 status.go:255] checking status of ha-333994 ...
	I0717 17:40:45.287098   36780 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:45.287152   36780 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:45.307461   36780 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36399
	I0717 17:40:45.307906   36780 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:45.308401   36780 main.go:141] libmachine: Using API Version  1
	I0717 17:40:45.308423   36780 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:45.308692   36780 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:45.308883   36780 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:40:45.310456   36780 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:40:45.310471   36780 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:40:45.310772   36780 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:45.310808   36780 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:45.325587   36780 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33057
	I0717 17:40:45.326029   36780 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:45.326511   36780 main.go:141] libmachine: Using API Version  1
	I0717 17:40:45.326534   36780 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:45.326847   36780 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:45.327029   36780 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:40:45.329606   36780 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:45.330035   36780 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:40:45.330068   36780 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:45.330177   36780 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:40:45.330554   36780 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:45.330617   36780 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:45.344997   36780 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38945
	I0717 17:40:45.345363   36780 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:45.345829   36780 main.go:141] libmachine: Using API Version  1
	I0717 17:40:45.345844   36780 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:45.346086   36780 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:45.346293   36780 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:40:45.346532   36780 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:40:45.346557   36780 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:40:45.349129   36780 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:45.349554   36780 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:40:45.349586   36780 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:40:45.349743   36780 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:40:45.349889   36780 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:40:45.350044   36780 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:40:45.350166   36780 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:40:45.434155   36780 ssh_runner.go:195] Run: systemctl --version
	I0717 17:40:45.440161   36780 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:40:45.454780   36780 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:40:45.454813   36780 api_server.go:166] Checking apiserver status ...
	I0717 17:40:45.454858   36780 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:40:45.469470   36780 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:40:45.479842   36780 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:40:45.479907   36780 ssh_runner.go:195] Run: ls
	I0717 17:40:45.484170   36780 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:40:45.488105   36780 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:40:45.488128   36780 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:40:45.488137   36780 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:40:45.488153   36780 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:40:45.488435   36780 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:45.488472   36780 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:45.503060   36780 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41117
	I0717 17:40:45.503493   36780 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:45.503962   36780 main.go:141] libmachine: Using API Version  1
	I0717 17:40:45.503984   36780 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:45.504260   36780 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:45.504442   36780 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:40:45.506063   36780 status.go:330] ha-333994-m02 host status = "Stopped" (err=<nil>)
	I0717 17:40:45.506077   36780 status.go:343] host is not running, skipping remaining checks
	I0717 17:40:45.506092   36780 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:40:45.506106   36780 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:40:45.506409   36780 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:45.506445   36780 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:45.521165   36780 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34941
	I0717 17:40:45.521589   36780 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:45.522044   36780 main.go:141] libmachine: Using API Version  1
	I0717 17:40:45.522062   36780 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:45.522343   36780 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:45.522518   36780 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:40:45.523833   36780 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:40:45.523846   36780 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:40:45.524128   36780 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:45.524157   36780 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:45.538613   36780 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36503
	I0717 17:40:45.538994   36780 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:45.539398   36780 main.go:141] libmachine: Using API Version  1
	I0717 17:40:45.539414   36780 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:45.539704   36780 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:45.539887   36780 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:40:45.542191   36780 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:45.542553   36780 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:40:45.542571   36780 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:45.542732   36780 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:40:45.543125   36780 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:45.543170   36780 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:45.557838   36780 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37009
	I0717 17:40:45.558304   36780 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:45.558720   36780 main.go:141] libmachine: Using API Version  1
	I0717 17:40:45.558749   36780 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:45.559029   36780 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:45.559168   36780 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:40:45.559337   36780 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:40:45.559358   36780 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:40:45.561662   36780 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:45.562012   36780 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:40:45.562038   36780 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:40:45.562153   36780 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:40:45.562316   36780 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:40:45.562435   36780 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:40:45.562541   36780 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:40:45.641929   36780 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:40:45.655710   36780 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:375: status says not all three control-plane nodes are present: args "out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr": ha-333994
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

                                                
                                                
ha-333994-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-333994-m03
type: Worker
host: Running
kubelet: Running

                                                
                                                
ha_test.go:378: status says not three hosts are running: args "out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr": ha-333994
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

                                                
                                                
ha-333994-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-333994-m03
type: Worker
host: Running
kubelet: Running

                                                
                                                
ha_test.go:381: status says not three kubelets are running: args "out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr": ha-333994
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

                                                
                                                
ha-333994-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-333994-m03
type: Worker
host: Running
kubelet: Running

                                                
                                                
ha_test.go:384: status says not two apiservers are running: args "out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr": ha-333994
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured

                                                
                                                
ha-333994-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-333994-m03
type: Worker
host: Running
kubelet: Running

                                                
                                                
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/StopSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/StopSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.219406683s)
helpers_test.go:252: TestMultiControlPlane/serial/StopSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node stop m02 -v=7         | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
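	The /etc/hosts command above is idempotent: it only rewrites or appends a 127.0.1.1 entry when no line already maps the hostname. A minimal Go sketch of that same check-then-rewrite-or-append logic (an illustrative helper, not minikube's actual code):

	```go
	package main

	import (
		"fmt"
		"regexp"
		"strings"
	)

	// ensureHostname mirrors the shell logic: if no line already ends with the
	// hostname, rewrite an existing 127.0.1.1 entry, else append a new one.
	func ensureHostname(hosts, name string) string {
		if regexp.MustCompile(`(?m)^.*\s` + regexp.QuoteMeta(name) + `$`).MatchString(hosts) {
			return hosts // hostname already present, nothing to do
		}
		re := regexp.MustCompile(`(?m)^127\.0\.1\.1\s.*$`)
		if re.MatchString(hosts) {
			return re.ReplaceAllString(hosts, "127.0.1.1 "+name)
		}
		return strings.TrimRight(hosts, "\n") + "\n127.0.1.1 " + name + "\n"
	}

	func main() {
		out := ensureHostname("127.0.0.1 localhost\n", "ha-333994")
		fmt.Print(strings.Contains(out, "127.0.1.1 ha-333994"))
	}
	```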
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
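	The server cert generated above carries the machine's IPs and names as SANs (127.0.0.1, 192.168.39.180, ha-333994, localhost, minikube). A self-signed sketch of issuing such a cert with Go's crypto/x509 (minikube actually signs with the CA key listed in the log; the self-signed template here is a simplification):

	```go
	package main

	import (
		"crypto/ecdsa"
		"crypto/elliptic"
		"crypto/rand"
		"crypto/x509"
		"crypto/x509/pkix"
		"fmt"
		"math/big"
		"net"
		"time"
	)

	func main() {
		key, _ := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
		tmpl := &x509.Certificate{
			SerialNumber: big.NewInt(1),
			Subject:      pkix.Name{Organization: []string{"jenkins.ha-333994"}},
			NotBefore:    time.Now(),
			NotAfter:     time.Now().Add(24 * time.Hour),
			// SANs matching the san=[...] list in the log line above.
			DNSNames:    []string{"ha-333994", "localhost", "minikube"},
			IPAddresses: []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.39.180")},
			ExtKeyUsage: []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
		}
		// Self-signed: template is its own parent.
		der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
		fmt.Println(err == nil, len(der) > 0)
	}
	```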
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
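	The guest-clock check above compares the VM's `date +%s.%N` output against the host time and accepts the 78ms delta. A sketch of that tolerance comparison (the 2s threshold is an assumption for illustration; minikube's actual tolerance may differ):

	```go
	package main

	import (
		"fmt"
		"time"
	)

	// withinTolerance reports whether the guest/host clock delta is acceptable.
	func withinTolerance(guest, host time.Time, tol time.Duration) bool {
		delta := guest.Sub(host)
		if delta < 0 {
			delta = -delta
		}
		return delta <= tol
	}

	func main() {
		guest := time.Unix(1721237158, 504657493) // parsed from `date +%s.%N`
		host := guest.Add(-78 * time.Millisecond) // host lagging ~78ms, as in the log
		fmt.Println(withinTolerance(guest, host, 2*time.Second))
	}
	```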
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
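	Taken together, the sed edits above rewrite the containerd config for the cgroupfs driver, the pause 3.9 sandbox image, the runc v2 runtime, and unprivileged ports. A rough sketch of the resulting /etc/containerd/config.toml fragment (assumed shape; the actual Buildroot file contains more sections):

	```toml
	version = 2

	[plugins."io.containerd.grpc.v1.cri"]
	  enable_unprivileged_ports = true
	  sandbox_image = "registry.k8s.io/pause:3.9"
	  restrict_oom_score_adj = false

	  [plugins."io.containerd.grpc.v1.cri".cni]
	    conf_dir = "/etc/cni/net.d"

	  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
	    runtime_type = "io.containerd.runc.v2"

	    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
	      SystemdCgroup = false
	```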
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
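	The retry above is the standard wait loop for /run/containerd/containerd.sock to appear after `systemctl restart containerd`. A minimal sketch of that retry pattern (simplified; minikube's retry.go uses backoff rather than a fixed delay):

	```go
	package main

	import (
		"errors"
		"fmt"
		"time"
	)

	// retry runs fn until it succeeds or attempts are exhausted,
	// sleeping a fixed delay between tries.
	func retry(attempts int, delay time.Duration, fn func() error) error {
		var err error
		for i := 0; i < attempts; i++ {
			if err = fn(); err == nil {
				return nil
			}
			time.Sleep(delay)
		}
		return err
	}

	func main() {
		calls := 0
		err := retry(3, time.Millisecond, func() error {
			calls++
			if calls < 2 {
				return errors.New("stat: socket not ready") // first probe fails
			}
			return nil // socket appeared on the second probe
		})
		fmt.Println(err == nil, calls)
	}
	```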
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] M
ountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
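The `define libvirt domain using xml` block above is the kvm2 driver rendering a domain definition and handing it to libvirt. A minimal, self-contained sketch of that step follows; the template, struct, and function names here are illustrative assumptions, not minikube's actual code.

```go
package main

import (
	"bytes"
	"encoding/xml"
	"fmt"
	"text/template"
)

// domainTmpl is a trimmed-down, hypothetical version of the libvirt
// domain XML logged above; it is not minikube's actual template.
const domainTmpl = `<domain type='kvm'>
  <name>{{.Name}}</name>
  <memory unit='MiB'>{{.Memory}}</memory>
  <vcpu>{{.CPU}}</vcpu>
  <os>
    <type>hvm</type>
    <boot dev='cdrom'/>
    <boot dev='hd'/>
  </os>
</domain>`

// machine holds the fields the template consumes (illustrative).
type machine struct {
	Name   string
	Memory int // MiB
	CPU    int
}

// renderDomainXML fills in the template and checks the result is
// well-formed XML before it would be defined via libvirt.
func renderDomainXML(m machine) (string, error) {
	t, err := template.New("domain").Parse(domainTmpl)
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := t.Execute(&buf, m); err != nil {
		return "", err
	}
	// Sanity-check the rendered definition parses as XML.
	var parsed struct {
		XMLName xml.Name `xml:"domain"`
		Name    string   `xml:"name"`
	}
	if err := xml.Unmarshal(buf.Bytes(), &parsed); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	out, err := renderDomainXML(machine{Name: "ha-333994-m02", Memory: 2200, CPU: 2})
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```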
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
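	The stat/scp pairs above follow a check-then-copy idiom: `stat` the file on the VM first, and transfer it only when it is absent or differs. A minimal local sketch of that pattern (plain file copies and temp paths stand in for the log's ssh_runner and /var/lib/minikube/certs):

```python
import os
import shutil
import tempfile

def sync(src: str, dst: str) -> str:
    """Copy src to dst only when dst is missing or its size differs:
    a rough local analogue of the stat-then-scp runs in the log."""
    try:
        if os.stat(dst).st_size == os.stat(src).st_size:
            return "skipped"
    except FileNotFoundError:
        pass  # dst missing: the remote stat exits with status 1
    shutil.copy(src, dst)
    return "copied"

d = tempfile.mkdtemp()
src = os.path.join(d, "ca.crt")
dst = os.path.join(d, "certs-ca.crt")
with open(src, "w") as f:
    f.write("-----BEGIN CERTIFICATE-----\n")
print(sync(src, dst))  # copied
print(sync(src, dst))  # skipped
```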
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
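	The three `ln -fs` runs above install each CA under its OpenSSL subject hash (printed by `openssl x509 -hash -noout`, e.g. `b5213941` for minikubeCA), so OpenSSL's hash-based lookup finds it in /etc/ssl/certs. A sketch of just the link step, with a temp dir and a hard-coded hash standing in for the real values:

```python
import os
import tempfile

certs_dir = tempfile.mkdtemp()  # stand-in for /etc/ssl/certs
cert = os.path.join(certs_dir, "minikubeCA.pem")
with open(cert, "w") as f:
    f.write("-----BEGIN CERTIFICATE-----\n")

# In the log this value comes from `openssl x509 -hash -noout -in <cert>`;
# it is hard-coded here for illustration only.
subject_hash = "b5213941"
link = os.path.join(certs_dir, subject_hash + ".0")

# Equivalent of: test -L <link> || ln -fs <cert> <link>
if not os.path.islink(link):
    os.symlink(cert, link)
print(os.readlink(link))
```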
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
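	The failed download above uses go-getter's `?checksum=file:<url>.sha256` form: the binary is fetched to a `.download` temp name, hashed, compared to the published digest, and only then renamed into the cache. A rough local analogue of that verify-then-commit step (local files and a fake payload stand in for the dl.k8s.io URLs):

```python
import hashlib
import os
import tempfile

def commit_download(tmp_path: str, final_path: str, expected_hex: str) -> bool:
    """Rename tmp_path to final_path only if its SHA-256 matches
    expected_hex; otherwise leave the .download file in place."""
    h = hashlib.sha256()
    with open(tmp_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    if h.hexdigest() != expected_hex:
        return False
    os.replace(tmp_path, final_path)
    return True

cache = tempfile.mkdtemp()
tmp = os.path.join(cache, "kubelet.download")  # mirrors the .download name in the error
final = os.path.join(cache, "kubelet")
data = b"fake-kubelet-bytes"  # hypothetical payload, not the real binary
with open(tmp, "wb") as f:
    f.write(data)
ok = commit_download(tmp, final, hashlib.sha256(data).hexdigest())
print(ok, os.path.exists(final))  # True True
```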
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       13 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       13 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       13 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       13 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       14 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       14 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       14 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       14 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       14 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       14 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       14 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:43 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         14m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      14m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 14m                kube-proxy       
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m (x3 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           14m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                14m                kubelet          Node ha-333994 status is now: NodeReady
	
	
	Name:               ha-333994-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_17T17_40_16_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:40:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:46 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:40:46 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:40:46 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:40:46 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:40:46 +0000   Wed, 17 Jul 2024 17:40:34 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.197
	  Hostname:    ha-333994-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 91a213a8eb09434f90fc54c32c57b24f
	  System UUID:                91a213a8-eb09-434f-90fc-54c32c57b24f
	  Boot ID:                    45ccee74-7f48-47d9-9195-b6f993074cc5
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-74lsp    0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kindnet-24fc8              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      31s
	  kube-system                 kube-proxy-xkkdj           0 (0%)        0 (0%)      0 (0%)           0 (0%)         31s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 27s                kube-proxy       
	  Normal  NodeHasSufficientMemory  31s (x2 over 31s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    31s (x2 over 31s)  kubelet          Node ha-333994-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     31s (x2 over 31s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  31s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           27s                node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	  Normal  NodeReady                12s                kubelet          Node ha-333994-m03 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.795898Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796088Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	
	
	==> kernel <==
	 17:40:46 up 15 min,  0 users,  load average: 0.30, 0.26, 0.16
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:39:36.593222       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:36.593331       1 main.go:303] handling current node
	I0717 17:39:46.601179       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:46.601359       1 main.go:303] handling current node
	I0717 17:39:56.594724       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:56.594776       1 main.go:303] handling current node
	I0717 17:40:06.602658       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:06.602795       1 main.go:303] handling current node
	I0717 17:40:16.593559       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:16.593631       1 main.go:303] handling current node
	I0717 17:40:16.593651       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:16.593660       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:16.594519       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.1.0/24 Src: <nil> Gw: 192.168.39.197 Flags: [] Table: 0} 
	I0717 17:40:26.593205       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:26.593326       1 main.go:303] handling current node
	I0717 17:40:26.593353       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:26.593491       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:36.593114       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:36.593470       1 main.go:303] handling current node
	I0717 17:40:36.593560       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:36.593643       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:46.594058       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:46.594094       1 main.go:303] handling current node
	I0717 17:40:46.594107       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:46.594112       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:37:16 ha-333994 kubelet[1321]: E0717 17:37:16.469310    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:16 ha-333994 kubelet[1321]: E0717 17:38:16.469271    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:36 ha-333994 kubelet[1321]: E0717 17:38:36.696894    1321 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.122.156:43908->192.168.122.156:10010: write tcp 192.168.122.156:43908->192.168.122.156:10010: write: broken pipe
	Jul 17 17:38:37 ha-333994 kubelet[1321]: E0717 17:38:37.471144    1321 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.122.156:43918->192.168.122.156:10010: read tcp 192.168.122.156:43918->192.168.122.156:10010: read: connection reset by peer
	Jul 17 17:39:16 ha-333994 kubelet[1321]: E0717 17:39:16.468909    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:40:16 ha-333994 kubelet[1321]: E0717 17:40:16.471379    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/StopSecondaryNode]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  3m31s (x3 over 13m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  4s (x2 over 13s)     default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/StopSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/StopSecondaryNode (3.61s)

                                                
                                    
TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (2.11s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:390: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
ha_test.go:413: expected profile "ha-333994" in json of 'profile list' to have "Degraded" status but have "Stopped" status. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-333994\",\"Status\":\"Stopped\",\"Config\":{\"Name\":\"ha-333994\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"kvm2\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACount\":
1,\"APIServerPort\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.2\",\"ClusterName\":\"ha-333994\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.168.39.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"containerd\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.168.39.180\",\"Port\":8443,
\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.168.39.127\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.168.39.197\",\"Port\":0,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"\",\"ControlPlane\":false,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":false,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-
device-plugin\":false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/home/jenkins:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"Disabl
eOptimizations\":false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-linux-amd64 profile list --output json"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.163812453s)
helpers_test.go:252: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node stop m02 -v=7         | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
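The preload sequence above is check, copy, extract, delete: stat the target for an existing tarball, scp the cached archive over, untar it into /var, then remove it. A minimal local sketch of that flow, using a scratch directory in place of the guest filesystem and plain tar in place of the lz4 pipeline (all paths hypothetical):

```shell
#!/usr/bin/env sh
set -eu

# Scratch directory standing in for the VM filesystem.
work=$(mktemp -d)
src="$work/preloaded.tar"          # stands in for the cached preload tarball
dest_root="$work/var"              # stands in for /var on the guest
mkdir -p "$dest_root" "$work/stage/lib/images"

# Build a tiny tarball to play the role of the preload archive.
echo "image-layer" > "$work/stage/lib/images/layer1"
tar -C "$work/stage" -cf "$src" lib

# 1. Existence check, mirroring: stat -c "%s %y" /preloaded.tar.lz4
if [ -f "$dest_root/preloaded.tar" ]; then
    echo "preload already present, skipping copy"
else
    cp "$src" "$dest_root/preloaded.tar"   # mirrors the scp step
fi

# 2. Extract into the target root, mirroring: tar -I lz4 -C /var -xf ...
tar -C "$dest_root" -xf "$dest_root/preloaded.tar"

# 3. Remove the archive once extracted, mirroring: rm /preloaded.tar.lz4
rm "$dest_root/preloaded.tar"

ls "$dest_root/lib/images"
```

The existence check up front is what makes the step cheap on restarts: when the tarball is already in place, the 394 MB copy is skipped entirely.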
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
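The failed `crictl images` at 17:26:04.594 is a race: `systemctl restart containerd` returns before the socket at /run/containerd/containerd.sock reappears, so the first query fails and retry.go reruns it after a short delay. The general pattern can be sketched as a retry-with-backoff wrapper; the function name, attempt count, and demo command are hypothetical:

```shell
#!/usr/bin/env sh
set -u

# retry N CMD...: run CMD until it succeeds, sleeping a doubling interval
# between attempts, giving up after N tries.
retry() {
    attempts=$1; shift
    delay=1
    i=1
    while :; do
        if "$@"; then
            return 0
        fi
        [ "$i" -ge "$attempts" ] && return 1
        echo "attempt $i failed; retrying in ${delay}s" >&2
        sleep "$delay"
        delay=$((delay * 2))
        i=$((i + 1))
    done
}

# Demo: a probe that fails until a flag file appears, much as crictl
# fails until the containerd socket comes back after the restart.
flag=$(mktemp -u)
( sleep 2; touch "$flag" ) &
retry 5 test -e "$flag" && echo "socket ready"
```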
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
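The rendered kubeadm.yaml above is one multi-document YAML stream carrying four kinds: InitConfiguration, ClusterConfiguration, KubeletConfiguration, and KubeProxyConfiguration. A quick pre-flight sanity check on such a file is to list its declared kinds in order; the heredoc below is a cut-down stand-in for the real config:

```shell
#!/usr/bin/env sh
set -eu

cfg=$(mktemp)
cat > "$cfg" <<'EOF'
apiVersion: kubeadm.k8s.io/v1beta3
kind: InitConfiguration
---
apiVersion: kubeadm.k8s.io/v1beta3
kind: ClusterConfiguration
---
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
---
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
EOF

# Print the kind declared by each YAML document, in stream order.
awk '/^kind:/ { print $2 }' "$cfg"
```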
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
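The kube-vip manifest above is configured entirely through container env vars, with `lb_enable`/`lb_port` present only because the `modprobe --all ip_vs ...` at 17:26:04.845 succeeded and load-balancing was auto-enabled. The name/value rendering can be sketched as a small formatter; the function name and the key=value inputs are hypothetical:

```shell
#!/usr/bin/env sh
set -eu

# Render kube-vip "env:" entries from key=value pairs, mirroring how the
# manifest pairs each name with a quoted value.
render_env() {
    for kv in "$@"; do
        key=${kv%%=*}
        val=${kv#*=}
        printf '    - name: %s\n      value: "%s"\n' "$key" "$val"
    done
}

render_env vip_arp=true port=8443 address=192.168.39.254 lb_enable=true
```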
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
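The bash one-liner above is an idempotent hosts-file update: strip any existing line for the name, append the fresh mapping, write to a temp file, and copy the result back into place. The same pattern against a scratch file (standing in for /etc/hosts, so no sudo is needed):

```shell
#!/usr/bin/env sh
set -eu

# Scratch file standing in for /etc/hosts, pre-seeded with a stale entry.
hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.168.39.1\tcontrol-plane.minikube.internal\n' > "$hosts"

ip="192.168.39.254"
name="control-plane.minikube.internal"
tab=$(printf '\t')

# Drop any existing line for $name, append the new mapping, then replace
# the file via a temp copy so the update is all-or-nothing.
tmp=$(mktemp)
{ grep -v "${tab}${name}\$" "$hosts" || true; printf '%s\t%s\n' "$ip" "$name"; } > "$tmp"
cp "$tmp" "$hosts"

grep "$name" "$hosts"
```

Rerunning the block leaves exactly one entry for the name, which is why minikube can apply it on every start without accumulating duplicates.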
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
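The apiserver cert above is first written under a suffixed name (`apiserver.crt.7fec389e`) and then copied to the canonical `apiserver.crt`, so a change in the requested SAN set yields a new versioned file instead of silently overwriting the old one. A sketch of that naming scheme follows; the use of sha256 for the suffix is an assumption for illustration (minikube derives its suffix differently), and all paths are scratch paths:

```shell
#!/usr/bin/env sh
set -eu

dir=$(mktemp -d)

# The SAN list the cert is issued for; changing it changes the suffix.
sans="10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254"

# Short content-derived suffix (sha256 here, not minikube's own hash).
suffix=$(printf '%s' "$sans" | sha256sum | cut -c1-8)

# Write the versioned file, then copy it to the canonical name.
echo "dummy-cert-for-$sans" > "$dir/apiserver.crt.$suffix"
cp "$dir/apiserver.crt.$suffix" "$dir/apiserver.crt"

ls "$dir"
```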
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
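Each CA above is installed by copying the PEM into /usr/share/ca-certificates and symlinking it from /etc/ssl/certs under its OpenSSL subject hash plus a `.0` suffix, which is the layout OpenSSL's hashed-directory lookup expects. A minimal reproduction in a scratch directory (assumes the `openssl` CLI is available; the cert is a throwaway self-signed stand-in):

```shell
#!/usr/bin/env sh
set -eu

dir=$(mktemp -d)
cd "$dir"

# Self-signed throwaway cert standing in for minikubeCA.pem.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -subj "/CN=scratch-ca" -keyout ca.key -out scratchCA.pem 2>/dev/null

# The subject hash names the symlink, mirroring: ln -fs <pem> <hash>.0
h=$(openssl x509 -hash -noout -in scratchCA.pem)
ln -fs "scratchCA.pem" "$h.0"

ls -la "$h.0"
```

The `51391683.0`, `3ec20f2e.0`, and `b5213941.0` names in the log are exactly such subject hashes for the three certificates being installed.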
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
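The four checks above apply one rule per kubeconfig: if the file does not mention the expected control-plane endpoint, delete it so kubeadm regenerates it (on this fresh VM every grep fails because the files simply do not exist yet). The same logic as a loop, run against scratch files rather than /etc/kubernetes:

```shell
#!/usr/bin/env sh
set -eu

dir=$(mktemp -d)
endpoint="https://control-plane.minikube.internal:8443"

# Seed one stale config and one valid one; the other two are absent,
# matching the fresh-VM case in the log.
echo "server: https://old-endpoint:8443" > "$dir/admin.conf"
echo "server: $endpoint" > "$dir/kubelet.conf"

for f in admin.conf kubelet.conf controller-manager.conf scheduler.conf; do
    if ! grep -q "$endpoint" "$dir/$f" 2>/dev/null; then
        echo "removing stale/missing $f"
        rm -f "$dir/$f"
    fi
done

ls "$dir"
```

Only configs that already point at the shared endpoint survive; everything else is cleared before `kubeadm init` runs.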
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)
}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       13 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       14 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       14 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       14 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       14 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       14 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       14 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       14 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       14 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       14 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       14 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:43 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:38:01 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     14m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         14m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      14m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 14m                kube-proxy       
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m (x4 over 14m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m (x3 over 14m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 14m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  14m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    14m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     14m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           14m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                14m                kubelet          Node ha-333994 status is now: NodeReady
	
	
	Name:               ha-333994-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_17T17_40_16_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:40:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:40:46 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:40:46 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:40:46 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:40:46 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:40:46 +0000   Wed, 17 Jul 2024 17:40:34 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.197
	  Hostname:    ha-333994-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 91a213a8eb09434f90fc54c32c57b24f
	  System UUID:                91a213a8-eb09-434f-90fc-54c32c57b24f
	  Boot ID:                    45ccee74-7f48-47d9-9195-b6f993074cc5
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-74lsp    0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kindnet-24fc8              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      33s
	  kube-system                 kube-proxy-xkkdj           0 (0%)        0 (0%)      0 (0%)           0 (0%)         33s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 29s                kube-proxy       
	  Normal  NodeHasSufficientMemory  33s (x2 over 33s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    33s (x2 over 33s)  kubelet          Node ha-333994-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     33s (x2 over 33s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  33s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           29s                node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	  Normal  NodeReady                14s                kubelet          Node ha-333994-m03 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.795898Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796088Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 1"}
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	
	
	==> kernel <==
	 17:40:48 up 15 min,  0 users,  load average: 0.30, 0.26, 0.16
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:39:36.593222       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:36.593331       1 main.go:303] handling current node
	I0717 17:39:46.601179       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:46.601359       1 main.go:303] handling current node
	I0717 17:39:56.594724       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:39:56.594776       1 main.go:303] handling current node
	I0717 17:40:06.602658       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:06.602795       1 main.go:303] handling current node
	I0717 17:40:16.593559       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:16.593631       1 main.go:303] handling current node
	I0717 17:40:16.593651       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:16.593660       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:16.594519       1 routes.go:62] Adding route {Ifindex: 0 Dst: 10.244.1.0/24 Src: <nil> Gw: 192.168.39.197 Flags: [] Table: 0} 
	I0717 17:40:26.593205       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:26.593326       1 main.go:303] handling current node
	I0717 17:40:26.593353       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:26.593491       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:36.593114       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:36.593470       1 main.go:303] handling current node
	I0717 17:40:36.593560       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:36.593643       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:40:46.594058       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:40:46.594094       1 main.go:303] handling current node
	I0717 17:40:46.594107       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:40:46.594112       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:36:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:36:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:37:16 ha-333994 kubelet[1321]: E0717 17:37:16.469310    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:37:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:37:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:16 ha-333994 kubelet[1321]: E0717 17:38:16.469271    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:38:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:38:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:38:36 ha-333994 kubelet[1321]: E0717 17:38:36.696894    1321 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.122.156:43908->192.168.122.156:10010: write tcp 192.168.122.156:43908->192.168.122.156:10010: write: broken pipe
	Jul 17 17:38:37 ha-333994 kubelet[1321]: E0717 17:38:37.471144    1321 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 192.168.122.156:43918->192.168.122.156:10010: read tcp 192.168.122.156:43918->192.168.122.156:10010: read: connection reset by peer
	Jul 17 17:39:16 ha-333994 kubelet[1321]: E0717 17:39:16.468909    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:39:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:39:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:40:16 ha-333994 kubelet[1321]: E0717 17:40:16.471379    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:40:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:40:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  3m33s (x3 over 13m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  6s (x2 over 15s)     default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (2.11s)

TestMultiControlPlane/serial/RestartSecondaryNode (314.25s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:420: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 node start m02 -v=7 --alsologtostderr
E0717 17:42:52.134221   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:44:41.796985   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
ha_test.go:420: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 node start m02 -v=7 --alsologtostderr: exit status 80 (4m18.255014614s)

                                                
                                                
-- stdout --
	* Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	* Restarting existing kvm2 VM for "ha-333994-m02" ...
	* Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	* Enabled addons: 
	* Verifying Kubernetes components...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:40:49.743931   37081 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:40:49.744399   37081 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:40:49.744451   37081 out.go:304] Setting ErrFile to fd 2...
	I0717 17:40:49.744468   37081 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:40:49.744961   37081 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:40:49.745606   37081 mustload.go:65] Loading cluster: ha-333994
	I0717 17:40:49.745934   37081 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:40:49.746306   37081 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:49.746344   37081 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:49.761593   37081 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44971
	I0717 17:40:49.761990   37081 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:49.762497   37081 main.go:141] libmachine: Using API Version  1
	I0717 17:40:49.762518   37081 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:49.762807   37081 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:49.762989   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	W0717 17:40:49.764368   37081 host.go:58] "ha-333994-m02" host status: Stopped
	I0717 17:40:49.766394   37081 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:40:49.767693   37081 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:40:49.767729   37081 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:40:49.767750   37081 cache.go:56] Caching tarball of preloaded images
	I0717 17:40:49.767843   37081 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:40:49.767857   37081 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:40:49.768015   37081 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:40:49.768263   37081 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:40:49.768320   37081 start.go:364] duration metric: took 30.3µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:40:49.768338   37081 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:40:49.768348   37081 fix.go:54] fixHost starting: m02
	I0717 17:40:49.768726   37081 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:40:49.768756   37081 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:40:49.783201   37081 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40649
	I0717 17:40:49.783660   37081 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:40:49.784131   37081 main.go:141] libmachine: Using API Version  1
	I0717 17:40:49.784152   37081 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:40:49.784425   37081 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:40:49.784622   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:40:49.784799   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:40:49.786164   37081 fix.go:112] recreateIfNeeded on ha-333994-m02: state=Stopped err=<nil>
	I0717 17:40:49.786187   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	W0717 17:40:49.786322   37081 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:40:49.788292   37081 out.go:177] * Restarting existing kvm2 VM for "ha-333994-m02" ...
	I0717 17:40:49.789383   37081 main.go:141] libmachine: (ha-333994-m02) Calling .Start
	I0717 17:40:49.789531   37081 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:40:49.790274   37081 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:40:49.790623   37081 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:40:49.790991   37081 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:40:49.791681   37081 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:40:50.968744   37081 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:40:50.969648   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:50.970132   37081 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:40:50.970159   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:50.970169   37081 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:40:50.970557   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:40:50.970578   37081 main.go:141] libmachine: (ha-333994-m02) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"}
	I0717 17:40:50.970590   37081 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:40:50.970614   37081 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:40:50.970625   37081 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:40:50.972500   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:50.972819   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:40:50.972843   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:40:50.972943   37081 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:40:50.973032   37081 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:40:50.973064   37081 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:40:50.973093   37081 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:40:50.973103   37081 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:41:02.101982   37081 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:41:02.102417   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:41:02.103028   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:41:02.105503   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.105914   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.105956   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.106221   37081 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:41:02.106401   37081 machine.go:94] provisionDockerMachine start ...
	I0717 17:41:02.106417   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:41:02.106633   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:02.108632   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.108946   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.108970   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.109089   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:02.109246   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.109376   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.109486   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:02.109640   37081 main.go:141] libmachine: Using SSH client type: native
	I0717 17:41:02.109835   37081 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:41:02.109848   37081 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:41:02.210732   37081 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:41:02.210765   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:41:02.211004   37081 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:41:02.211027   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:41:02.211210   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:02.213859   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.214231   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.214256   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.214420   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:02.214628   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.214793   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.214928   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:02.215079   37081 main.go:141] libmachine: Using SSH client type: native
	I0717 17:41:02.215234   37081 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:41:02.215244   37081 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:41:02.329074   37081 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:41:02.329101   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:02.332004   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.332346   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.332374   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.332529   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:02.332708   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.332884   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.333017   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:02.333200   37081 main.go:141] libmachine: Using SSH client type: native
	I0717 17:41:02.333368   37081 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:41:02.333392   37081 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:41:02.439612   37081 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:41:02.439645   37081 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:41:02.439684   37081 buildroot.go:174] setting up certificates
	I0717 17:41:02.439694   37081 provision.go:84] configureAuth start
	I0717 17:41:02.439712   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:41:02.439966   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:41:02.442371   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.442744   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.442778   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.442862   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:02.444931   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.445238   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.445271   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.445388   37081 provision.go:143] copyHostCerts
	I0717 17:41:02.445424   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:41:02.445459   37081 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:41:02.445471   37081 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:41:02.445530   37081 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:41:02.445619   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:41:02.445642   37081 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:41:02.445649   37081 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:41:02.445679   37081 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:41:02.445740   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:41:02.445755   37081 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:41:02.445763   37081 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:41:02.445784   37081 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:41:02.445841   37081 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:41:02.589008   37081 provision.go:177] copyRemoteCerts
	I0717 17:41:02.589063   37081 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:41:02.589084   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:02.591663   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.592005   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.592035   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.592250   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:02.592429   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.592584   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:02.592695   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:41:02.676421   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:41:02.676497   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:41:02.702202   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:41:02.702280   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:41:02.726723   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:41:02.726785   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0717 17:41:02.750662   37081 provision.go:87] duration metric: took 310.952949ms to configureAuth
	I0717 17:41:02.750686   37081 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:41:02.750896   37081 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:41:02.750908   37081 machine.go:97] duration metric: took 644.496018ms to provisionDockerMachine
	I0717 17:41:02.750915   37081 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:41:02.750927   37081 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:41:02.750956   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:41:02.751263   37081 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:41:02.751291   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:02.753491   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.753841   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.753864   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.753996   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:02.754173   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.754338   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:02.754476   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:41:02.832822   37081 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:41:02.837161   37081 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:41:02.837181   37081 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:41:02.837256   37081 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:41:02.837354   37081 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:41:02.837368   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:41:02.837481   37081 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:41:02.846992   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:41:02.870624   37081 start.go:296] duration metric: took 119.696511ms for postStartSetup
	I0717 17:41:02.870658   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:41:02.870955   37081 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:41:02.870988   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:02.873305   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.873658   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:02.873695   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:02.873847   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:02.874026   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:02.874196   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:02.874348   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:41:02.952952   37081 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:41:02.953020   37081 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:41:03.009383   37081 fix.go:56] duration metric: took 13.241028492s for fixHost
	I0717 17:41:03.009422   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:03.012020   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:03.012379   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:03.012404   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:03.012612   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:03.012790   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:03.012929   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:03.013094   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:03.013227   37081 main.go:141] libmachine: Using SSH client type: native
	I0717 17:41:03.013381   37081 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:41:03.013389   37081 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:41:03.114865   37081 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238063.079993674
	
	I0717 17:41:03.114884   37081 fix.go:216] guest clock: 1721238063.079993674
	I0717 17:41:03.114891   37081 fix.go:229] Guest: 2024-07-17 17:41:03.079993674 +0000 UTC Remote: 2024-07-17 17:41:03.009406179 +0000 UTC m=+13.299862053 (delta=70.587495ms)
	I0717 17:41:03.114922   37081 fix.go:200] guest clock delta is within tolerance: 70.587495ms
	I0717 17:41:03.114927   37081 start.go:83] releasing machines lock for "ha-333994-m02", held for 13.346596861s
	I0717 17:41:03.114944   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:41:03.115188   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:41:03.117612   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:03.117941   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:03.117964   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:03.118085   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:41:03.118576   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:41:03.118728   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:41:03.118825   37081 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:41:03.118860   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:03.118974   37081 ssh_runner.go:195] Run: systemctl --version
	I0717 17:41:03.118994   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:41:03.121310   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:03.121527   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:03.121705   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:03.121729   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:03.121849   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:03.121867   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:03.121884   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:03.122012   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:41:03.122084   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:03.122191   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:41:03.122369   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:03.122399   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:41:03.122523   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:41:03.122564   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:41:03.227400   37081 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:41:03.233219   37081 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:41:03.233281   37081 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:41:03.249175   37081 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:41:03.249204   37081 start.go:495] detecting cgroup driver to use...
	I0717 17:41:03.249270   37081 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:41:03.273876   37081 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:41:03.287441   37081 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:41:03.287508   37081 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:41:03.302143   37081 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:41:03.315989   37081 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:41:03.429297   37081 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:41:03.577522   37081 docker.go:233] disabling docker service ...
	I0717 17:41:03.577615   37081 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:41:03.592287   37081 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:41:03.604967   37081 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:41:03.745354   37081 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:41:03.870943   37081 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:41:03.891622   37081 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:41:03.910637   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:41:03.921410   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:41:03.932379   37081 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:41:03.932468   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:41:03.943824   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:41:03.954702   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:41:03.965664   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:41:03.976018   37081 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:41:03.986690   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:41:03.996765   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:41:04.007054   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:41:04.017289   37081 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:41:04.026360   37081 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:41:04.026417   37081 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:41:04.039290   37081 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:41:04.048597   37081 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:41:04.165587   37081 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:41:04.194954   37081 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:41:04.195028   37081 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:41:04.199446   37081 retry.go:31] will retry after 921.252694ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:41:05.121509   37081 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:41:05.127000   37081 start.go:563] Will wait 60s for crictl version
	I0717 17:41:05.127053   37081 ssh_runner.go:195] Run: which crictl
	I0717 17:41:05.130790   37081 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:41:05.171609   37081 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:41:05.171678   37081 ssh_runner.go:195] Run: containerd --version
	I0717 17:41:05.199712   37081 ssh_runner.go:195] Run: containerd --version
	I0717 17:41:05.227272   37081 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:41:05.228800   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:41:05.231335   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:05.231812   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:41:05.231840   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:41:05.232000   37081 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:41:05.236169   37081 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:41:05.248698   37081 mustload.go:65] Loading cluster: ha-333994
	I0717 17:41:05.248959   37081 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:41:05.249245   37081 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:41:05.249280   37081 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:41:05.264378   37081 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38343
	I0717 17:41:05.264857   37081 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:41:05.265306   37081 main.go:141] libmachine: Using API Version  1
	I0717 17:41:05.265326   37081 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:41:05.265616   37081 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:41:05.265794   37081 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:41:05.267403   37081 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:41:05.267735   37081 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:41:05.267774   37081 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:41:05.281710   37081 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39581
	I0717 17:41:05.282146   37081 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:41:05.282650   37081 main.go:141] libmachine: Using API Version  1
	I0717 17:41:05.282670   37081 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:41:05.282954   37081 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:41:05.283107   37081 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:41:05.283258   37081 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:41:05.283270   37081 certs.go:194] generating shared ca certs ...
	I0717 17:41:05.283286   37081 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:41:05.283466   37081 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:41:05.283521   37081 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:41:05.283533   37081 certs.go:256] generating profile certs ...
	I0717 17:41:05.283647   37081 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:41:05.283718   37081 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:41:05.283774   37081 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:41:05.283788   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:41:05.283804   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:41:05.283824   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:41:05.283839   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:41:05.283855   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:41:05.283872   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:41:05.283888   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:41:05.283904   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:41:05.283955   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:41:05.283998   37081 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:41:05.284008   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:41:05.284044   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:41:05.284072   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:41:05.284096   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:41:05.284160   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:41:05.284195   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:41:05.284218   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:41:05.284235   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:41:05.284263   37081 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:41:05.286976   37081 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:41:05.287365   37081 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:41:05.287388   37081 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:41:05.287485   37081 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:41:05.287637   37081 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:41:05.287766   37081 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:41:05.287887   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:41:05.366513   37081 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0717 17:41:05.371830   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:41:05.384549   37081 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0717 17:41:05.388961   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:41:05.399728   37081 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:41:05.404176   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:41:05.416124   37081 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:41:05.420804   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:41:05.435197   37081 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:41:05.440207   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:41:05.453379   37081 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0717 17:41:05.458855   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:41:05.471182   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:41:05.500866   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:41:05.526100   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:41:05.550463   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:41:05.575524   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:41:05.600591   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:41:05.627676   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:41:05.651870   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:41:05.677550   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:41:05.701832   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:41:05.725559   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:41:05.754176   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:41:05.770797   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:41:05.787548   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:41:05.804053   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:41:05.820440   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:41:05.837358   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:41:05.854104   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:41:05.871924   37081 ssh_runner.go:195] Run: openssl version
	I0717 17:41:05.877834   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:41:05.888389   37081 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:41:05.892833   37081 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:41:05.892889   37081 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:41:05.898728   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:41:05.909438   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:41:05.919919   37081 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:41:05.924346   37081 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:41:05.924420   37081 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:41:05.929890   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:41:05.940442   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:41:05.951137   37081 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:41:05.955755   37081 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:41:05.955823   37081 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:41:05.961522   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
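The three openssl/ln sequences above follow OpenSSL's `c_rehash` convention: each CA certificate installed under /usr/share/ca-certificates gets a symlink in /etc/ssl/certs named after its subject hash plus a collision index (the log shows hash b5213941 for minikubeCA.pem, hence the link b5213941.0). A minimal Python sketch of that link step, with the hash taken as precomputed (exactly what `openssl x509 -hash -noout` prints):

```python
import os
import tempfile

def install_ca_symlink(cert_path: str, subject_hash: str, certs_dir: str) -> str:
    """Mimic `ln -fs <cert> /etc/ssl/certs/<hash>.0` from the log above.

    subject_hash is the value `openssl x509 -hash -noout -in <cert>` prints;
    the `.0` suffix is the collision index (first cert with that hash gets .0).
    """
    link = os.path.join(certs_dir, subject_hash + ".0")
    if os.path.islink(link) or os.path.exists(link):
        os.remove(link)  # the `-f` in `ln -fs`: replace any stale link
    os.symlink(cert_path, link)
    return link

# Demo with the hash the log reports for minikubeCA.pem:
tmp = tempfile.mkdtemp()
cert = os.path.join(tmp, "minikubeCA.pem")
open(cert, "w").write("dummy cert\n")
print(install_ca_symlink(cert, "b5213941", tmp))
```

The `test -L ... ||` guard in the log makes the operation idempotent across restarts; the sketch gets the same effect by removing any existing link first.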
	I0717 17:41:05.973007   37081 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:41:05.977019   37081 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:41:05.977078   37081 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:41:05.977185   37081 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
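The kubelet drop-in above is templated per node: the binary path comes from the Kubernetes version, while `--hostname-override` and `--node-ip` come from the joining node (here m02 at 192.168.39.127). A sketch of assembling that ExecStart line (function name hypothetical):

```python
def kubelet_exec_start(version: str, node_name: str, node_ip: str) -> str:
    """Assemble the kubelet ExecStart command seen in the drop-in above."""
    binary = f"/var/lib/minikube/binaries/{version}/kubelet"
    flags = [
        "--bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf",
        "--config=/var/lib/kubelet/config.yaml",
        f"--hostname-override={node_name}",
        "--kubeconfig=/etc/kubernetes/kubelet.conf",
        f"--node-ip={node_ip}",
    ]
    return " ".join([binary] + flags)

print(kubelet_exec_start("v1.30.2", "ha-333994-m02", "192.168.39.127"))
```

Note the empty `ExecStart=` line preceding the real one in the unit: that is the standard systemd idiom for resetting an inherited ExecStart before overriding it in a drop-in.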
	I0717 17:41:05.977223   37081 kube-vip.go:115] generating kube-vip config ...
	I0717 17:41:05.977266   37081 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:41:05.993693   37081 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:41:05.993815   37081 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
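The static pod manifest above is generated from a flat set of kube-vip settings; the "auto-enabling control-plane load-balancing" line corresponds to the lb_enable/lb_port pair being added. A sketch of turning those settings into the pod's `env:` entries (values match the manifest above; the helper itself is illustrative, not minikube's actual generator):

```python
def kube_vip_env(vip: str, port: int, lb: bool) -> list:
    """Build the env list for the kube-vip static pod, as in the manifest above."""
    settings = {
        "vip_arp": "true",
        "port": str(port),
        "vip_interface": "eth0",
        "vip_cidr": "32",
        "dns_mode": "first",
        "cp_enable": "true",
        "cp_namespace": "kube-system",
        "vip_leaderelection": "true",
        "vip_leasename": "plndr-cp-lock",
        "vip_leaseduration": "5",   # lease/renew/retry timings from the manifest
        "vip_renewdeadline": "3",
        "vip_retryperiod": "1",
        "address": vip,
        "prometheus_server": ":2112",
    }
    if lb:  # the auto-enabled control-plane load balancing
        settings["lb_enable"] = "true"
        settings["lb_port"] = str(port)
    return [{"name": k, "value": v} for k, v in settings.items()]

env = kube_vip_env("192.168.39.254", 8443, lb=True)
```

The leader-election envs are what make the VIP (192.168.39.254) float: whichever control-plane node holds the plndr-cp-lock lease answers ARP for the address.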
	I0717 17:41:05.993890   37081 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:41:06.004772   37081 binaries.go:47] Didn't find k8s binaries: didn't find preexisting kubelet
	Initiating transfer...
	I0717 17:41:06.004860   37081 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:41:06.015193   37081 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256
	I0717 17:41:06.015201   37081 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256
	I0717 17:41:06.015224   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm -> /var/lib/minikube/binaries/v1.30.2/kubeadm
	I0717 17:41:06.015241   37081 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:41:06.015201   37081 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
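The `?checksum=file:<url>.sha256` suffix in the download URLs above instructs the fetcher to retrieve the published SHA-256 digest alongside the binary and verify it before use. A sketch of that verification step, run offline against dummy bytes:

```python
import hashlib

def verify_sha256(data: bytes, published: str) -> bool:
    """Compare downloaded bytes against a published hex digest, as requested
    by the checksum=file:...sha256 query in the log. Published .sha256 files
    may carry a trailing filename, so take only the first token."""
    return hashlib.sha256(data).hexdigest() == published.strip().split()[0]

blob = b"kubelet binary bytes"
digest = hashlib.sha256(blob).hexdigest()
print(verify_sha256(blob, digest))         # matching digest
print(verify_sha256(b"tampered", digest))  # mismatch
```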
	I0717 17:41:06.015307   37081 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubeadm
	I0717 17:41:06.015314   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:41:06.015377   37081 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:41:06.030872   37081 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubeadm': No such file or directory
	I0717 17:41:06.030902   37081 ssh_runner.go:356] copy: skipping /var/lib/minikube/binaries/v1.30.2/kubectl (exists)
	I0717 17:41:06.030872   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet -> /var/lib/minikube/binaries/v1.30.2/kubelet
	I0717 17:41:06.030918   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm --> /var/lib/minikube/binaries/v1.30.2/kubeadm (50249880 bytes)
	I0717 17:41:06.031007   37081 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubelet
	I0717 17:41:06.053246   37081 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubelet': No such file or directory
	I0717 17:41:06.053293   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet --> /var/lib/minikube/binaries/v1.30.2/kubelet (100124920 bytes)
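Note the asymmetry above: kubectl passes its `stat` probe and is skipped ("exists"), while kubeadm and kubelet fail with status 1 and get copied. A sketch of that copy-if-missing check, with local files standing in for the SSH runner:

```python
import os
import shutil
import tempfile

def copy_if_missing(src: str, dst: str) -> str:
    """Mirror the ssh_runner existence check: stat the target,
    skip on success, copy on 'No such file or directory'."""
    try:
        os.stat(dst)           # the `stat -c "%s %y"` probe in the log
        return "skipped"       # "copy: skipping ... (exists)"
    except FileNotFoundError:
        shutil.copy(src, dst)  # "scp ... --> ..."
        return "copied"

tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "kubelet")
open(src, "w").write("binary")
dst = os.path.join(tmp, "kubelet.installed")
print(copy_if_missing(src, dst))  # copied
print(copy_if_missing(src, dst))  # skipped
```

(The real check also compares size and mtime in the `stat -c "%s %y"` output, so a stale or truncated binary would still be re-copied; the sketch only models the missing-file path.)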
	I0717 17:41:06.744852   37081 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0717 17:41:06.754277   37081 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
	I0717 17:41:06.770802   37081 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:41:06.787285   37081 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:41:06.803568   37081 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:41:06.807750   37081 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
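The bash one-liner above updates /etc/hosts safely: filter out any existing control-plane.minikube.internal entry, append the current VIP mapping, write the result to a temp file, then copy it into place in one step. A Python sketch of the same filter-and-append on the file's contents:

```python
def update_hosts(content: str, ip: str,
                 host: str = "control-plane.minikube.internal") -> str:
    """Drop any line ending in '\\t<host>' and append the fresh mapping,
    mirroring the `{ grep -v ...; echo ...; }` pipeline in the log."""
    kept = [l for l in content.splitlines() if not l.endswith("\t" + host)]
    kept.append(f"{ip}\t{host}")
    return "\n".join(kept) + "\n"

before = "127.0.0.1\tlocalhost\n192.168.39.1\tcontrol-plane.minikube.internal\n"
print(update_hosts(before, "192.168.39.254"))
```

Writing to /tmp/h.$$ and then `sudo cp`-ing over /etc/hosts avoids truncating the file mid-read and keeps the grep from consuming its own output.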
	I0717 17:41:06.819602   37081 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:41:06.926111   37081 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:41:06.948115   37081 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:41:06.948208   37081 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:41:06.948327   37081 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:41:06.950375   37081 out.go:177] * Enabled addons: 
	I0717 17:41:06.950381   37081 out.go:177] * Verifying Kubernetes components...
	I0717 17:41:06.951865   37081 addons.go:510] duration metric: took 3.670088ms for enable addons: enabled=[]
	I0717 17:41:06.951955   37081 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:41:07.093263   37081 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:41:07.948216   37081 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:41:07.948479   37081 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0717 17:41:07.948551   37081 kubeadm.go:483] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.180:8443
	I0717 17:41:07.948937   37081 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:41:07.949083   37081 node_ready.go:35] waiting up to 6m0s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:41:07.949172   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:07.949182   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:07.949192   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:07.949196   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:07.957921   37081 round_trippers.go:574] Response Status: 404 Not Found in 8 milliseconds
	I0717 17:41:08.449574   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:08.449604   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:08.449615   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:08.449620   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:08.451972   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:08.949592   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:08.949615   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:08.949623   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:08.949627   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:08.952028   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:09.449641   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:09.449668   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:09.449678   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:09.449714   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:09.452245   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:09.949258   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:09.949281   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:09.949289   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:09.949295   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:09.951605   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:09.951703   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
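The repeated GET/404 pairs above are node_ready's poll loop: query the node object roughly every 500ms for up to 6 minutes, logging a "not found" summary every few misses, until the kubelet registers the node and it reports Ready. A sketch of that wait (the fetcher is a stand-in for the apiserver call; here the failure never clears in the log, so the real run eventually times out):

```python
import time

def wait_for_node(fetch, timeout_s: float = 360.0, interval_s: float = 0.5):
    """Poll fetch() until it returns a node, as node_ready does;
    fetch raising LookupError stands in for the 404s in the log."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            return fetch()
        except LookupError:
            time.sleep(interval_s)
    raise TimeoutError("node never appeared within timeout")

# Simulated apiserver: 404 three times, then the node registers.
attempts = {"n": 0}
def fake_fetch():
    attempts["n"] += 1
    if attempts["n"] < 4:
        raise LookupError('nodes "ha-333994-m02" not found')
    return {"name": "ha-333994-m02"}

node = wait_for_node(fake_fetch, timeout_s=10, interval_s=0.01)
```

In this failed test the 404s persist for the full wait, which is why kubelet on m02 started (`systemctl start kubelet` above) but the node object never appeared at the apiserver.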
	I0717 17:41:10.450273   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:10.450296   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:10.450307   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:10.450311   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:10.452603   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:10.949271   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:10.949293   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:10.949306   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:10.949310   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:10.951728   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:11.449494   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:11.449520   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:11.449528   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:11.449532   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:11.451705   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:11.949403   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:11.949425   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:11.949433   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:11.949437   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:11.951816   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:11.951901   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:12.449412   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:12.449441   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:12.449452   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:12.449458   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:12.451824   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:12.949334   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:12.949356   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:12.949363   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:12.949368   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:12.951504   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:13.450291   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:13.450319   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:13.450329   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:13.450334   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:13.452660   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:13.949312   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:13.949332   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:13.949339   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:13.949343   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:13.951644   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:14.449301   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:14.449326   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:14.449331   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:14.449335   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:14.451960   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:14.452082   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:14.949587   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:14.949609   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:14.949615   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:14.949621   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:14.952057   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:15.449985   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:15.450015   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:15.450041   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:15.450046   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:15.452325   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:15.950081   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:15.950109   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:15.950135   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:15.950141   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:15.952228   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:16.449951   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:16.449972   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:16.449978   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:16.449981   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:16.455720   37081 round_trippers.go:574] Response Status: 404 Not Found in 5 milliseconds
	I0717 17:41:16.455837   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:16.949409   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:16.949431   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:16.949440   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:16.949447   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:16.951499   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:17.450281   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:17.450311   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:17.450322   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:17.450327   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:17.452453   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:17.950231   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:17.950250   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:17.950260   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:17.950286   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:17.952495   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:18.450227   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:18.450250   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:18.450261   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:18.450267   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:18.452571   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:18.949247   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:18.949272   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:18.949281   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:18.949285   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:18.951540   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:18.951643   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:19.450258   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:19.450281   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:19.450288   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:19.450293   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:19.452577   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:19.949564   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:19.949586   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:19.949594   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:19.949599   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:19.952593   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:20.450276   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:20.450301   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:20.450312   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:20.450319   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:20.452399   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:20.950152   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:20.950174   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:20.950183   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:20.950188   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:20.952469   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:20.952566   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:21.450205   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:21.450266   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:21.450275   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:21.450278   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:21.452554   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:21.950280   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:21.950303   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:21.950311   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:21.950315   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:21.952523   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:22.450262   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:22.450285   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:22.450293   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:22.450297   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:22.452393   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:22.949997   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:22.950024   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:22.950034   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:22.950040   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:22.952355   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:23.450087   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:23.450110   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:23.450131   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:23.450134   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:23.452519   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:23.452597   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:23.950271   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:23.950300   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:23.950308   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:23.950312   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:23.952596   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:24.450135   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:24.450161   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:24.450173   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:24.450181   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:24.452581   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:24.949255   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:24.949277   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:24.949284   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:24.949287   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:24.951637   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:25.450074   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:25.450096   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:25.450103   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:25.450107   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:25.452622   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:25.452710   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:25.949273   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:25.949295   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:25.949303   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:25.949307   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:25.951790   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:26.449458   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:26.449481   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:26.449489   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:26.449492   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:26.451617   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:26.949266   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:26.949287   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:26.949295   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:26.949298   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:26.951455   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:27.450199   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:27.450220   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:27.450227   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:27.450232   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:27.452554   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:27.950193   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:27.950213   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:27.950221   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:27.950226   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:27.952490   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:27.952609   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:28.449743   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:28.449765   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:28.449772   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:28.449775   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:28.452096   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:28.949804   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:28.949831   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:28.949841   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:28.949847   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:28.952340   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:29.450126   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:29.450151   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:29.450162   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:29.450165   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:29.452229   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:29.950141   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:29.950166   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:29.950178   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:29.950184   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:29.952611   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:29.952728   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:30.449247   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:30.449268   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:30.449275   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:30.449279   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:30.451962   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:30.950279   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:30.950317   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:30.950326   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:30.950331   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:30.952449   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:31.450261   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:31.450286   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:31.450294   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:31.450300   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:31.452636   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:31.950035   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:31.950063   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:31.950074   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:31.950082   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:31.952680   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:31.952798   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:32.449412   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:32.449434   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:32.449441   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:32.449445   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:32.451648   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:32.949364   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:32.949386   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:32.949394   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:32.949398   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:32.951662   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:33.449339   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:33.449358   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:33.449364   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:33.449369   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:33.452011   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:33.949709   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:33.949730   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:33.949738   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:33.949742   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:33.951940   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:34.449328   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:34.449368   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:34.449377   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:34.449381   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:34.451572   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:34.451679   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:34.949348   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:34.949371   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:34.949379   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:34.949382   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:34.951712   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:35.449318   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:35.449343   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:35.449354   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:35.449359   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:35.451749   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:35.949457   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:35.949480   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:35.949486   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:35.949492   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:35.952154   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:36.449877   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:36.449904   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:36.449915   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:36.449920   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:36.452639   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:36.452763   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:36.949281   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:36.949305   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:36.949313   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:36.949316   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:36.951622   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:37.449403   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:37.449428   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:37.449440   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:37.449445   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:37.453253   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:41:37.950047   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:37.950075   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:37.950086   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:37.950090   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:37.952540   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:38.450296   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:38.450318   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:38.450326   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:38.450329   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:38.452789   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:38.452894   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:38.949454   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:38.949478   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:38.949489   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:38.949497   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:38.951996   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:39.449448   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:39.449498   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:39.449510   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:39.449538   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:39.452233   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:39.950110   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:39.950146   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:39.950157   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:39.950165   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:39.952408   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:40.450013   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:40.450036   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:40.450044   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:40.450047   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:40.452572   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:40.949272   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:40.949296   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:40.949304   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:40.949308   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:40.951587   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:40.951701   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:41.449270   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:41.449293   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:41.449300   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:41.449306   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:41.451720   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:41.949374   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:41.949418   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:41.949431   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:41.949437   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:41.951663   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:42.449983   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:42.450010   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:42.450022   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:42.450029   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:42.452815   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:42.949535   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:42.949558   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:42.949569   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:42.949577   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:42.951867   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:42.952082   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:43.449639   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:43.449661   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:43.449667   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:43.449671   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:43.452781   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:41:43.949332   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:43.949355   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:43.949362   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:43.949366   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:43.951818   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:44.449441   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:44.449480   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:44.449488   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:44.449492   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:44.451747   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:44.949229   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:44.949252   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:44.949263   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:44.949267   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:44.951587   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:45.450308   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:45.450337   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:45.450348   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:45.450365   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:45.453102   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:45.453209   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:45.949832   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:45.949854   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:45.949861   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:45.949865   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:45.952032   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:46.449696   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:46.449718   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:46.449726   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:46.449739   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:46.451961   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:46.949634   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:46.949659   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:46.949667   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:46.949672   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:46.952207   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:47.449968   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:47.449993   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:47.450000   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:47.450004   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:47.452167   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:47.949915   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:47.949937   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:47.949945   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:47.949950   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:47.952143   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:47.952245   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:48.449880   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:48.449901   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:48.449909   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:48.449914   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:48.452275   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:48.950008   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:48.950029   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:48.950036   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:48.950040   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:48.952295   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:49.449996   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:49.450017   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:49.450026   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:49.450029   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:49.453283   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:41:49.949327   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:49.949352   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:49.949363   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:49.949368   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:49.951627   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:50.449265   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:50.449285   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:50.449293   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:50.449297   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:50.451728   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:50.451845   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:50.949409   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:50.949433   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:50.949442   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:50.949445   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:50.951888   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:51.449551   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:51.449574   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:51.449581   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:51.449584   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:51.452113   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:51.949853   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:51.949874   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:51.949882   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:51.949886   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:51.952042   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:52.449276   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:52.449297   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:52.449308   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:52.449312   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:52.451308   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:41:52.950028   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:52.950050   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:52.950057   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:52.950061   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:52.952425   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:52.952547   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:53.450260   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:53.450290   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:53.450299   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:53.450305   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:53.453819   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:41:53.949661   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:53.949689   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:53.949709   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:53.949719   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:53.952584   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:54.449284   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:54.449307   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:54.449317   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:54.449322   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:54.451861   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:54.949602   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:54.949627   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:54.949639   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:54.949646   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:54.951952   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:55.449628   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:55.449650   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:55.449659   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:55.449662   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:55.452239   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:55.452470   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:55.950020   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:55.950042   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:55.950049   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:55.950053   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:55.952875   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:56.449458   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:56.449499   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:56.449510   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:56.449515   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:56.452058   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:56.949754   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:56.949830   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:56.949847   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:56.949855   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:56.952745   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:57.449390   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:57.449415   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:57.449427   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:57.449431   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:57.451956   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:57.949687   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:57.949709   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:57.949716   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:57.949719   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:57.951934   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:57.952044   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:41:58.449609   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:58.449631   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:58.449638   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:58.449642   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:58.452007   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:58.949259   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:58.949281   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:58.949289   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:58.949295   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:58.952116   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:59.449714   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:59.449735   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:59.449743   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:59.449747   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:59.452490   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:41:59.949340   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:41:59.949362   37081 round_trippers.go:469] Request Headers:
	I0717 17:41:59.949370   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:41:59.949373   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:41:59.951421   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:00.450035   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:00.450057   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:00.450065   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:00.450069   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:00.452296   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:00.452414   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:00.950012   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:00.950035   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:00.950045   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:00.950051   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:00.952350   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:01.450104   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:01.450149   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:01.450160   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:01.450166   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:01.453050   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:01.949395   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:01.949416   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:01.949424   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:01.949427   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:01.951756   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:02.449445   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:02.449466   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:02.449474   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:02.449477   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:02.452034   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:02.949715   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:02.949743   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:02.949751   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:02.949755   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:02.952023   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:02.952116   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:03.449464   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:03.449484   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:03.449492   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:03.449498   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:03.452501   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:03.949243   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:03.949265   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:03.949273   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:03.949275   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:03.951456   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:04.450219   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:04.450241   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:04.450250   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:04.450253   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:04.452663   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:04.949323   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:04.949345   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:04.949353   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:04.949358   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:04.952407   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:42:04.952512   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:05.450134   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:05.450159   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:05.450170   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:05.450176   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:05.452421   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:05.950189   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:05.950212   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:05.950223   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:05.950230   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:05.952631   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:06.449284   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:06.449307   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:06.449315   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:06.449319   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:06.453013   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:42:06.949776   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:06.949800   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:06.949812   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:06.949817   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:06.952185   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:07.449923   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:07.449945   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:07.449952   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:07.449956   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:07.452200   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:07.452301   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:07.950008   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:07.950036   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:07.950045   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:07.950050   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:07.952799   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:08.449430   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:08.449454   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:08.449461   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:08.449466   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:08.451657   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:08.949300   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:08.949323   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:08.949330   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:08.949333   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:08.951568   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:09.450264   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:09.450288   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:09.450295   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:09.450299   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:09.452971   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:09.453075   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:09.949976   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:09.949998   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:09.950005   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:09.950025   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:09.952330   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:10.449997   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:10.450017   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:10.450025   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:10.450029   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:10.452873   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:10.949498   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:10.949521   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:10.949531   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:10.949536   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:10.952069   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:11.449766   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:11.449787   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:11.449796   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:11.449800   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:11.452274   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:11.949972   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:11.949996   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:11.950006   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:11.950012   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:11.952316   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:11.952430   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:12.450062   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:12.450094   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:12.450102   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:12.450107   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:12.452404   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:12.950102   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:12.950134   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:12.950144   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:12.950151   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:12.952144   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:42:13.449891   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:13.449913   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:13.449924   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:13.449929   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:13.452466   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:13.950255   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:13.950279   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:13.950289   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:13.950293   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:13.952781   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:13.952879   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:14.449447   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:14.449469   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:14.449477   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:14.449481   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:14.451992   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:14.949728   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:14.949753   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:14.949763   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:14.949768   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:14.952599   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:15.450266   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:15.450288   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:15.450299   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:15.450304   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:15.453204   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:15.949940   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:15.949962   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:15.949970   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:15.949973   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:15.952339   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:16.450085   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:16.450108   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:16.450144   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:16.450151   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:16.452619   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:16.452762   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:16.949271   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:16.949294   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:16.949318   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:16.949324   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:16.951767   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:17.449443   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:17.449465   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:17.449473   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:17.449478   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:17.451706   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:17.949361   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:17.949383   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:17.949391   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:17.949396   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:17.951886   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:18.449542   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:18.449566   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:18.449577   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:18.449583   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:18.451912   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:18.949553   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:18.949586   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:18.949596   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:18.949600   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:18.951917   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:18.952028   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:19.449581   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:19.449605   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:19.449616   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:19.449622   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:19.452330   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:19.949407   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:19.949431   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:19.949439   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:19.949447   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:19.951804   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:20.449423   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:20.449446   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:20.449454   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:20.449458   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:20.451926   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:20.949567   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:20.949592   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:20.949600   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:20.949604   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:20.951938   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:20.952072   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:21.449618   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:21.449639   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:21.449647   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:21.449651   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:21.451980   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:21.949635   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:21.949665   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:21.949676   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:21.949680   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:21.952119   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:22.449813   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:22.449834   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:22.449842   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:22.449845   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:22.452383   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:22.950149   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:22.950174   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:22.950183   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:22.950186   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:22.952425   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:22.952558   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:23.450160   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:23.450181   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:23.450205   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:23.450210   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:23.452807   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:23.949447   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:23.949468   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:23.949476   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:23.949481   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:23.951694   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:24.449386   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:24.449410   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:24.449417   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:24.449422   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:24.451935   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:24.950064   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:24.950085   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:24.950093   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:24.950096   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:24.952825   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:24.952949   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:25.449357   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:25.449379   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:25.449387   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:25.449391   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:25.451640   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:25.949325   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:25.949350   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:25.949362   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:25.949369   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:25.951688   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:26.449332   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:26.449355   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:26.449365   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:26.449392   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:26.452355   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:26.950074   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:26.950099   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:26.950109   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:26.950134   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:26.953041   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:26.953166   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:27.449261   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:27.449285   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:27.449293   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:27.449296   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:27.451952   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:27.949732   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:27.949751   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:27.949759   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:27.949763   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:27.952114   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:28.450003   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:28.450025   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:28.450049   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:28.450053   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:28.452455   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:28.950210   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:28.950232   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:28.950240   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:28.950244   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:28.952859   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:29.449739   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:29.449763   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:29.449773   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:29.449777   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:29.451965   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:29.452105   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:29.949898   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:29.949917   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:29.949924   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:29.949928   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:29.952501   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:30.450202   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:30.450222   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:30.450230   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:30.450235   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:30.452716   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:30.949329   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:30.949351   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:30.949359   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:30.949362   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:30.951464   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:31.450239   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:31.450262   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:31.450270   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:31.450274   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:31.452542   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:31.452672   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:31.950261   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:31.950281   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:31.950289   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:31.950293   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:31.952463   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:32.450189   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:32.450212   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:32.450219   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:32.450222   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:32.452565   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:32.949233   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:32.949264   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:32.949272   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:32.949276   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:32.951522   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:33.450243   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:33.450266   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:33.450275   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:33.450277   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:33.452546   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:33.950297   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:33.950320   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:33.950328   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:33.950331   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:33.952050   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:42:33.952167   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:34.449755   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:34.449777   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:34.449784   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:34.449788   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:34.452030   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:34.949974   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:34.950000   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:34.950009   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:34.950024   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:34.952958   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:35.449385   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:35.449409   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:35.449419   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:35.449424   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:35.452061   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:35.949709   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:35.949729   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:35.949743   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:35.949747   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:35.952326   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:35.952431   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:36.449818   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:36.449850   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:36.449862   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:36.449867   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:36.452678   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:36.949390   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:36.949415   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:36.949423   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:36.949428   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:36.951858   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:37.449504   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:37.449528   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:37.449535   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:37.449540   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:37.452005   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:37.949786   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:37.949809   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:37.949816   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:37.949821   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:37.951863   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:38.449617   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:38.449640   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:38.449647   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:38.449650   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:38.451786   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:38.451886   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:38.949432   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:38.949455   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:38.949463   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:38.949468   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:38.952153   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:39.450162   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:39.450188   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:39.450200   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:39.450208   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:39.452881   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:39.949935   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:39.949958   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:39.949964   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:39.949967   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:39.952181   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:40.449746   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:40.449772   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:40.449784   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:40.449789   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:40.452136   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:40.452234   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:40.949863   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:40.949884   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:40.949893   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:40.949898   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:40.952341   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:41.450082   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:41.450108   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:41.450127   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:41.450133   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:41.452540   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:41.950304   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:41.950339   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:41.950354   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:41.950359   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:41.952586   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:42.449269   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:42.449292   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:42.449303   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:42.449310   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:42.451834   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:42.949491   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:42.949518   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:42.949529   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:42.949538   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:42.951893   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:42.952034   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:43.449574   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:43.449600   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:43.449606   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:43.449611   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:43.452049   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:43.949752   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:43.949776   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:43.949785   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:43.949789   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:43.952469   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:44.450210   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:44.450232   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:44.450243   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:44.450248   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:44.458246   37081 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
	I0717 17:42:44.950040   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:44.950067   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:44.950079   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:44.950086   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:44.952904   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:44.953011   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:45.449243   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:45.449266   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:45.449274   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:45.449279   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:45.451684   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:45.949326   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:45.949346   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:45.949354   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:45.949359   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:45.952193   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:46.449901   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:46.449922   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:46.449930   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:46.449935   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:46.452037   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:46.949713   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:46.949735   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:46.949741   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:46.949746   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:46.952339   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:47.450093   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:47.450136   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:47.450149   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:47.450153   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:47.452888   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:47.453002   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:47.949514   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:47.949539   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:47.949547   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:47.949551   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:47.952084   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:48.449750   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:48.449774   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:48.449782   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:48.449788   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:48.451639   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:42:48.949299   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:48.949320   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:48.949327   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:48.949331   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:48.951780   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:49.449511   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:49.449541   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:49.449549   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:49.449554   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:49.452235   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:49.949586   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:49.949615   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:49.949627   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:49.949634   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:49.952160   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:49.952256   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:50.449669   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:50.449692   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:50.449698   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:50.449703   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:50.452052   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:50.949707   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:50.949732   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:50.949739   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:50.949743   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:50.952243   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:51.449991   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:51.450014   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:51.450023   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:51.450028   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:51.452520   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:51.950261   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:51.950284   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:51.950293   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:51.950298   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:51.952661   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:51.952756   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:52.449337   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:52.449362   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:52.449370   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:52.449376   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:52.451885   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:52.949720   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:52.949750   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:52.949761   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:52.949767   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:52.952216   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:53.449991   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:53.450012   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:53.450021   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:53.450023   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:53.452656   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:53.949393   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:53.949417   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:53.949425   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:53.949428   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:53.951962   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:54.449616   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:54.449637   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:54.449645   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:54.449650   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:54.451976   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:54.452072   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:54.949841   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:54.949863   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:54.949871   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:54.949876   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:54.952235   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:55.449784   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:55.449806   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:55.449813   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:55.449818   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:55.452656   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:55.949958   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:55.949989   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:55.950000   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:55.950007   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:55.953067   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:42:56.449762   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:56.449792   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:56.449801   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:56.449808   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:56.452280   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:56.452382   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:56.950004   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:56.950027   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:56.950053   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:56.950058   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:56.952350   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:57.450085   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:57.450106   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:57.450114   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:57.450126   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:57.452524   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:57.949237   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:57.949259   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:57.949268   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:57.949273   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:57.951512   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:58.450248   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:58.450269   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:58.450276   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:58.450280   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:58.452532   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:58.452640   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:42:58.950235   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:58.950256   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:58.950264   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:58.950267   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:58.952459   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:59.450196   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:59.450218   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:59.450229   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:59.450234   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:59.452402   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:42:59.949216   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:42:59.949254   37081 round_trippers.go:469] Request Headers:
	I0717 17:42:59.949266   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:42:59.949271   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:42:59.952323   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:00.449960   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:00.449987   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:00.449998   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:00.450002   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:00.452516   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:00.950240   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:00.950262   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:00.950270   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:00.950275   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:00.953877   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:00.953996   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:01.449581   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:01.449614   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:01.449622   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:01.449627   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:01.452177   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:01.949892   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:01.949915   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:01.949922   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:01.949926   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:01.952590   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:02.449219   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:02.449252   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:02.449259   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:02.449264   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:02.451507   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:02.950265   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:02.950288   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:02.950297   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:02.950302   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:02.952577   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:03.449245   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:03.449268   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:03.449278   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:03.449284   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:03.451802   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:03.451894   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:03.949467   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:03.949489   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:03.949498   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:03.949504   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:03.951569   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:04.449233   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:04.449258   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:04.449268   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:04.449273   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:04.451591   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:04.949228   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:04.949249   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:04.949256   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:04.949260   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:04.952224   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:05.449983   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:05.450009   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:05.450020   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:05.450026   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:05.452405   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:05.452513   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:05.950170   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:05.950190   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:05.950198   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:05.950205   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:05.952659   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:06.449329   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:06.449353   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:06.449361   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:06.449365   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:06.451851   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:06.949473   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:06.949515   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:06.949523   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:06.949530   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:06.952103   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:07.449761   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:07.449791   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:07.449802   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:07.449808   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:07.452032   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:07.949735   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:07.949758   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:07.949766   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:07.949769   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:07.953319   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:07.953438   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:08.450024   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:08.450047   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:08.450056   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:08.450059   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:08.452208   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:08.949907   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:08.949928   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:08.949936   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:08.949941   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:08.952344   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:09.450048   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:09.450069   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:09.450077   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:09.450080   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:09.452429   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:09.949367   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:09.949393   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:09.949406   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:09.949412   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:09.951593   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:10.450155   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:10.450181   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:10.450193   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:10.450197   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:10.452694   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:10.452818   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:10.949272   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:10.949296   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:10.949307   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:10.949313   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:10.951792   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:11.449438   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:11.449462   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:11.449473   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:11.449479   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:11.451715   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:11.949361   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:11.949383   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:11.949391   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:11.949395   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:11.951709   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:12.449339   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:12.449361   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:12.449369   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:12.449374   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:12.451777   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:12.949429   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:12.949456   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:12.949469   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:12.949475   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:12.951757   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:12.951900   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:13.449402   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:13.449424   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:13.449431   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:13.449435   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:13.451820   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:13.949514   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:13.949536   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:13.949544   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:13.949548   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:13.951751   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:14.449409   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:14.449430   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:14.449438   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:14.449443   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:14.451736   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:14.949496   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:14.949518   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:14.949525   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:14.949544   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:14.951847   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:14.951954   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:15.449514   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:15.449535   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:15.449551   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:15.449555   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:15.452399   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:15.950168   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:15.950189   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:15.950197   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:15.950201   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:15.952466   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:16.450211   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:16.450235   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:16.450242   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:16.450248   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:16.452886   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:16.949534   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:16.949559   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:16.949569   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:16.949577   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:16.952044   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:16.952163   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:17.449782   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:17.449806   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:17.449818   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:17.449823   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:17.452104   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:17.949825   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:17.949852   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:17.949863   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:17.949867   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:17.953070   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:18.449281   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:18.449303   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:18.449311   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:18.449314   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:18.451376   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:18.950155   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:18.950178   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:18.950186   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:18.950189   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:18.953193   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:18.953497   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:19.449703   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:19.449730   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:19.449738   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:19.449741   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:19.452061   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:19.950178   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:19.950199   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:19.950206   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:19.950212   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:19.952244   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:20.449855   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:20.449880   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:20.449892   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:20.449898   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:20.452427   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:20.950249   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:20.950277   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:20.950287   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:20.950296   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:20.953464   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:20.953576   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:21.450167   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:21.450189   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:21.450198   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:21.450221   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:21.452953   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:21.949607   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:21.949637   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:21.949645   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:21.949650   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:21.951991   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:22.449634   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:22.449655   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:22.449663   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:22.449667   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:22.452165   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:22.949903   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:22.949926   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:22.949933   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:22.949936   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:22.952319   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:23.450026   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:23.450047   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:23.450055   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:23.450059   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:23.452411   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:23.452508   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:23.949674   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:23.949701   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:23.949711   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:23.949715   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:23.951729   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:43:24.449389   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:24.449412   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:24.449420   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:24.449425   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:24.452037   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:24.950146   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:24.950170   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:24.950182   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:24.950188   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:24.952447   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:25.450003   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:25.450040   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:25.450049   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:25.450056   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:25.452607   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:25.452697   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:25.949278   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:25.949304   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:25.949314   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:25.949319   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:25.951746   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:26.449357   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:26.449377   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:26.449384   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:26.449389   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:26.452345   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:26.950085   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:26.950106   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:26.950131   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:26.950136   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:26.952493   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:27.450218   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:27.450234   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:27.450242   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:27.450246   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:27.452891   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:27.453001   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:27.949641   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:27.949665   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:27.949675   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:27.949680   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:27.952429   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:28.450202   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:28.450224   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:28.450233   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:28.450238   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:28.453125   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:28.949381   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:28.949406   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:28.949417   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:28.949420   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:28.960303   37081 round_trippers.go:574] Response Status: 404 Not Found in 10 milliseconds
	I0717 17:43:29.450096   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:29.450148   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:29.450160   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:29.450171   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:29.452703   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:29.949805   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:29.949830   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:29.949840   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:29.949845   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:29.951897   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:29.951993   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:30.449276   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:30.449321   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:30.449329   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:30.449335   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:30.451746   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:30.949327   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:30.949349   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:30.949356   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:30.949360   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:30.951459   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:31.449264   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:31.449285   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:31.449293   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:31.449299   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:31.451885   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:31.949587   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:31.949613   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:31.949622   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:31.949626   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:31.952081   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:31.952199   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:32.449833   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:32.449857   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:32.449868   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:32.449876   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:32.452407   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:32.950175   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:32.950198   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:32.950207   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:32.950212   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:32.952414   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:33.450161   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:33.450208   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:33.450216   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:33.450220   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:33.452853   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:33.949508   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:33.949530   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:33.949538   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:33.949541   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:33.951578   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:34.449232   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:34.449255   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:34.449263   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:34.449266   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:34.451790   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:34.451921   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:34.949346   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:34.949370   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:34.949383   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:34.949394   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:34.951561   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:35.450238   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:35.450262   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:35.450270   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:35.450273   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:35.452939   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:35.949621   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:35.949652   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:35.949663   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:35.949668   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:35.951728   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:36.449403   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:36.449425   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:36.449432   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:36.449436   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:36.452967   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:36.453073   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:36.949625   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:36.949655   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:36.949667   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:36.949676   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:36.953243   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:37.449998   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:37.450023   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:37.450031   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:37.450035   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:37.452305   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:37.950171   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:37.950191   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:37.950199   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:37.950202   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:37.953584   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:38.450337   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:38.450360   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:38.450370   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:38.450374   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:38.453142   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:38.453253   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:38.949850   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:38.949890   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:38.949898   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:38.949901   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:38.952105   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:39.449815   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:39.449860   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:39.449871   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:39.449875   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:39.452559   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:39.949566   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:39.949593   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:39.949602   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:39.949608   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:39.952007   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:40.449641   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:40.449663   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:40.449671   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:40.449676   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:40.452044   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:40.949716   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:40.949739   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:40.949747   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:40.949751   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:40.952436   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:40.952607   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:41.450241   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:41.450267   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:41.450280   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:41.450285   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:41.452809   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:41.949507   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:41.949533   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:41.949552   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:41.949557   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:41.952213   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:42.450021   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:42.450041   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:42.450050   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:42.450055   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:42.453323   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:42.950057   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:42.950077   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:42.950084   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:42.950088   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:42.952462   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:43.450226   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:43.450249   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:43.450257   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:43.450260   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:43.452497   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:43.452606   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:43.950284   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:43.950304   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:43.950313   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:43.950317   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:43.952590   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:44.449264   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:44.449289   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:44.449300   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:44.449307   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:44.451645   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:44.949270   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:44.949291   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:44.949299   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:44.949303   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:44.952004   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:45.449741   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:45.449765   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:45.449773   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:45.449776   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:45.452283   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:45.950097   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:45.950138   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:45.950155   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:45.950159   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:45.952273   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:45.952358   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:46.450072   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:46.450099   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:46.450109   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:46.450134   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:46.452717   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:46.949389   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:46.949410   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:46.949418   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:46.949421   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:46.952138   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:47.449897   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:47.449924   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:47.449934   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:47.449939   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:47.453058   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:43:47.949786   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:47.949821   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:47.949829   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:47.949834   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:47.952393   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:47.952494   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:48.450108   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:48.450148   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:48.450158   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:48.450164   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:48.453081   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:48.949991   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:48.950014   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:48.950023   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:48.950027   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:48.952072   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:49.449744   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:49.449768   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:49.449778   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:49.449785   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:49.452085   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:49.950032   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:49.950056   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:49.950066   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:49.950071   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:49.952408   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:50.450005   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:50.450028   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:50.450038   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:50.450042   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:50.452729   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:50.452814   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:50.949373   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:50.949394   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:50.949402   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:50.949406   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:50.951516   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:51.450300   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:51.450328   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:51.450338   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:51.450343   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:51.452766   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:51.949423   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:51.949443   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:51.949451   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:51.949455   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:51.951817   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:52.449453   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:52.449478   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:52.449488   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:52.449493   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:52.451781   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:52.949436   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:52.949463   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:52.949475   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:52.949480   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:52.951832   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:52.951947   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:53.449339   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:53.449363   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:53.449371   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:53.449380   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:53.452104   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:53.949859   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:53.949880   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:53.949888   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:53.949891   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:53.952510   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:54.450264   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:54.450293   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:54.450304   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:54.450310   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:54.452913   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:54.949635   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:54.949657   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:54.949665   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:54.949670   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:54.952066   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:54.952157   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:55.449659   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:55.449690   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:55.449702   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:55.449710   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:55.452066   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:55.949743   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:55.949769   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:55.949781   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:55.949785   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:55.952471   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:56.449798   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:56.449819   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:56.449828   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:56.449832   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:56.452393   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:56.950161   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:56.950185   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:56.950193   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:56.950197   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:56.952650   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:56.952767   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:57.450285   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:57.450313   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:57.450325   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:57.450330   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:57.452799   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:57.949533   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:57.949554   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:57.949562   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:57.949565   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:57.951955   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:58.449776   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:58.449802   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:58.449814   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:58.449825   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:58.452100   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:58.949749   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:58.949771   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:58.949781   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:58.949787   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:58.952182   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:59.449929   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:59.449951   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:59.449959   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:59.449964   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:59.452235   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:43:59.452351   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:43:59.950255   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:43:59.950280   37081 round_trippers.go:469] Request Headers:
	I0717 17:43:59.950292   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:43:59.950300   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:43:59.952723   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:00.449270   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:00.449294   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:00.449304   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:00.449309   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:00.452444   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:44:00.950203   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:00.950225   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:00.950232   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:00.950236   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:00.952504   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:01.450253   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:01.450275   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:01.450282   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:01.450286   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:01.452728   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:01.452839   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:01.949432   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:01.949459   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:01.949469   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:01.949474   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:01.951965   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:02.449629   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:02.449654   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:02.449663   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:02.449669   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:02.452190   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:02.949992   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:02.950013   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:02.950021   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:02.950025   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:02.952338   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:03.449636   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:03.449659   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:03.449669   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:03.449675   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:03.452455   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:03.950231   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:03.950254   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:03.950262   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:03.950266   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:03.952472   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:03.952579   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:04.450285   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:04.450310   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:04.450320   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:04.450323   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:04.452910   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:04.949616   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:04.949647   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:04.949660   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:04.949667   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:04.952331   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:05.449809   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:05.449830   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:05.449838   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:05.449841   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:05.452782   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:05.949342   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:05.949364   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:05.949372   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:05.949375   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:05.951801   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:06.449432   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:06.449455   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:06.449463   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:06.449467   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:06.452191   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:06.452308   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:06.949911   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:06.949935   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:06.949945   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:06.949949   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:06.952099   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:07.449779   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:07.449800   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:07.449808   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:07.449811   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:07.451995   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:07.949744   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:07.949771   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:07.949782   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:07.949788   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:07.952436   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:08.450209   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:08.450233   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:08.450244   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:08.450250   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:08.453079   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:08.453177   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:08.949739   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:08.949762   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:08.949770   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:08.949774   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:08.951845   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:09.449385   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:09.449407   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:09.449415   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:09.449418   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:09.452156   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:09.950165   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:09.950185   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:09.950192   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:09.950198   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:09.952468   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:10.450275   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:10.450297   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:10.450305   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:10.450314   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:10.452652   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:10.949338   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:10.949360   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:10.949368   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:10.949371   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:10.951962   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:10.952061   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:11.449611   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:11.449633   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:11.449640   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:11.449644   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:11.452208   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:11.950022   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:11.950045   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:11.950053   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:11.950057   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:11.952633   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:12.449271   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:12.449294   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:12.449301   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:12.449305   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:12.452220   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:12.949986   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:12.950007   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:12.950019   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:12.950024   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:12.953013   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:12.953114   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:13.449706   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:13.449731   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:13.449738   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:13.449743   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:13.452118   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:13.949877   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:13.949900   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:13.949908   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:13.949913   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:13.952803   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:14.449525   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:14.449551   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:14.449563   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:14.449571   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:14.452506   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:14.949258   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:14.949282   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:14.949296   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:14.949302   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:14.951822   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:15.449584   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:15.449605   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:15.449613   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:15.449616   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:15.452927   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:44:15.453039   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:15.949587   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:15.949607   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:15.949614   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:15.949617   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:15.952886   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:44:16.449537   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:16.449572   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:16.449580   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:16.449585   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:16.452021   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:16.949705   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:16.949737   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:16.949747   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:16.949754   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:16.952287   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:17.450010   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:17.450031   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:17.450039   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:17.450043   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:17.452650   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:17.949451   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:17.949475   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:17.949486   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:17.949491   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:17.953180   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:44:17.953292   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:18.449869   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:18.449901   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:18.449910   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:18.449914   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:18.452248   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:18.949974   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:18.949997   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:18.950007   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:18.950013   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:18.952145   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:19.449900   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:19.449921   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:19.449929   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:19.449934   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:19.452221   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:19.949192   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:19.949213   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:19.949221   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:19.949226   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:19.951591   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:20.450198   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:20.450220   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:20.450228   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:20.450232   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:20.452626   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:20.452741   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:20.949250   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:20.949270   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:20.949277   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:20.949282   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:20.951567   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:21.449273   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:21.449297   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:21.449304   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:21.449307   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:21.451675   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:21.949322   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:21.949341   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:21.949348   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:21.949353   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:21.951532   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:22.450299   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:22.450327   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:22.450338   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:22.450344   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:22.452758   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:22.452850   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:22.950011   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:22.950047   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:22.950058   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:22.950067   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:22.952399   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:23.450155   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:23.450184   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:23.450196   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:23.450199   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:23.452635   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:23.949282   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:23.949319   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:23.949327   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:23.949332   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:23.951473   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:24.450191   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:24.450214   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:24.450223   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:24.450227   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:24.452721   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:24.949245   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:24.949270   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:24.949277   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:24.949282   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:24.951300   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:24.951403   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:25.449870   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:25.449890   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:25.449898   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:25.449902   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:25.452707   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:25.949253   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:25.949276   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:25.949284   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:25.949289   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:25.952042   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:26.449709   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:26.449732   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:26.449747   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:26.449753   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:26.451889   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:26.949593   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:26.949617   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:26.949627   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:26.949631   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:26.951301   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:44:26.951462   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:27.449611   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:27.449635   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:27.449643   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:27.449648   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:27.452090   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:27.949962   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:27.949987   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:27.950005   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:27.950010   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:27.952081   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:28.449736   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:28.449761   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:28.449773   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:28.449779   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:28.451848   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:28.949509   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:28.949531   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:28.949539   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:28.949543   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:28.951983   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:28.952103   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:29.449597   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:29.449622   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:29.449630   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:29.449635   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:29.452069   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:29.950006   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:29.950029   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:29.950039   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:29.950043   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:29.952677   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:30.450202   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:30.450227   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:30.450239   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:30.450244   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:30.452848   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:30.949540   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:30.949562   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:30.949569   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:30.949573   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:30.951882   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:31.449606   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:31.449629   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:31.449639   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:31.449643   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:31.452075   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:31.452193   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:31.949707   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:31.949738   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:31.949749   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:31.949753   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:31.952409   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:32.450111   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:32.450144   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:32.450152   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:32.450155   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:32.452581   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:32.949254   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:32.949277   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:32.949287   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:32.949292   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:32.951662   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:33.449313   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:33.449356   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:33.449367   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:33.449373   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:33.451866   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:33.949526   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:33.949545   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:33.949553   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:33.949558   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:33.951561   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:44:33.951676   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:34.449236   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:34.449257   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:34.449266   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:34.449270   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:34.451490   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:34.949232   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:34.949253   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:34.949261   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:34.949265   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:34.951833   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:35.449463   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:35.449483   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:35.449490   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:35.449494   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:35.451788   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:35.949441   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:35.949465   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:35.949473   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:35.949477   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:35.951679   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:35.951777   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:36.449328   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:36.449349   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:36.449358   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:36.449361   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:36.451819   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:36.949469   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:36.949492   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:36.949500   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:36.949503   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:36.951962   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:37.449641   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:37.449662   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:37.449670   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:37.449674   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:37.451999   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:37.949721   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:37.949745   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:37.949752   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:37.949757   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:37.952096   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:37.952210   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:38.449359   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:38.449382   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:38.449390   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:38.449393   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:38.452127   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:38.949866   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:38.949890   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:38.949899   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:38.949904   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:38.952243   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:39.449602   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:39.449624   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:39.449632   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:39.449637   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:39.452170   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:39.950010   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:39.950030   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:39.950038   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:39.950043   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:39.952179   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:39.952260   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:40.449694   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:40.449715   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:40.449726   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:40.449733   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:40.452021   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:40.949767   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:40.949794   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:40.949805   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:40.949809   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:40.952308   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:41.450023   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:41.450045   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:41.450052   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:41.450061   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:41.452372   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:41.950133   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:41.950159   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:41.950169   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:41.950173   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:41.952343   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:41.952449   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:42.450089   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:42.450111   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:42.450129   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:42.450137   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:42.452441   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:42.950199   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:42.950223   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:42.950230   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:42.950234   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:42.952567   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:43.449207   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:43.449245   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:43.449255   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:43.449260   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:43.451661   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:43.949276   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:43.949315   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:43.949323   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:43.949328   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:43.951510   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:44.450239   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:44.450265   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:44.450274   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:44.450278   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:44.452906   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:44.453018   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:44.949587   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:44.949608   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:44.949616   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:44.949619   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:44.951624   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:44:45.450017   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:45.450038   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:45.450047   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:45.450050   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:45.452427   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:45.950207   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:45.950231   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:45.950242   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:45.950247   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:45.952442   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:46.450186   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:46.450210   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:46.450222   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:46.450229   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:46.452460   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:46.950173   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:46.950196   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:46.950205   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:46.950210   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:46.952664   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:46.952869   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:47.449354   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:47.449375   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:47.449383   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:47.449387   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:47.451831   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:47.949561   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:47.949583   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:47.949591   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:47.949596   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:47.951797   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:48.449433   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:48.449455   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:48.449462   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:48.449471   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:48.451613   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:48.949282   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:48.949303   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:48.949311   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:48.949316   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:48.951384   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:49.450163   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:49.450186   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:49.450192   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:49.450196   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:49.452485   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:49.452604   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:49.950014   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:49.950036   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:49.950044   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:49.950049   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:49.952661   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:50.449236   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:50.449261   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:50.449269   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:50.449273   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:50.451559   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:50.949313   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:50.949334   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:50.949342   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:50.949346   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:50.951517   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:51.449322   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:51.449347   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:51.449358   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:51.449363   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:51.451910   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:51.949546   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:51.949587   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:51.949596   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:51.949601   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:51.951818   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:51.951915   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:52.449457   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:52.449484   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:52.449496   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:52.449503   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:52.451569   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:52.950109   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:52.950146   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:52.950154   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:52.950158   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:52.952456   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:53.450194   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:53.450215   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:53.450224   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:53.450228   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:53.452925   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:53.949603   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:53.949627   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:53.949636   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:53.949640   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:53.951992   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:53.952105   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:54.449662   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:54.449684   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:54.449692   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:54.449697   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:54.452086   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:54.949967   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:54.949992   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:54.950000   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:54.950006   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:54.952148   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:55.449585   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:55.449604   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:55.449612   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:55.449616   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:55.452219   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:55.950044   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:55.950068   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:55.950077   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:55.950083   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:55.954995   37081 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:44:55.955113   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:56.449365   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:56.449384   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:56.449393   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:56.449397   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:56.452012   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:56.949696   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:56.949718   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:56.949728   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:56.949732   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:56.951970   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:57.449647   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:57.449671   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:57.449681   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:57.449685   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:57.451813   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:57.949469   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:57.949500   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:57.949508   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:57.949513   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:57.951835   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:58.449444   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:58.449483   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:58.449491   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:58.449496   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:58.451784   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:58.451916   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:44:58.949414   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:58.949434   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:58.949442   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:58.949446   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:58.952341   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:59.449663   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:59.449684   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:59.449692   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:59.449696   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:59.451808   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:44:59.949569   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:44:59.949593   37081 round_trippers.go:469] Request Headers:
	I0717 17:44:59.949602   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:44:59.949606   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:44:59.951748   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:00.449229   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:00.449252   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:00.449261   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:00.449266   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:00.451495   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:00.950164   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:00.950187   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:00.950195   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:00.950201   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:00.952265   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:00.952373   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:45:01.449827   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:01.449849   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:01.449858   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:01.449863   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:01.452083   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:01.949772   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:01.949798   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:01.949810   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:01.949815   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:01.952012   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:02.449672   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:02.449693   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:02.449700   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:02.449704   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:02.451891   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:02.949546   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:02.949568   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:02.949576   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:02.949580   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:02.951444   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:45:03.450199   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:03.450223   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:03.450231   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:03.450235   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:03.452453   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:03.452559   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:45:03.950197   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:03.950218   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:03.950226   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:03.950230   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:03.952488   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:04.450076   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:04.450102   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:04.450112   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:04.450137   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:04.452431   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:04.950259   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:04.950282   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:04.950290   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:04.950294   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:04.952507   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:05.450157   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:05.450180   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:05.450191   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:05.450196   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:05.452443   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:05.950194   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:05.950218   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:05.950230   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:05.950238   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:05.952374   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:05.952471   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:45:06.450143   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:06.450172   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:06.450187   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:06.450193   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:06.452832   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:06.949536   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:06.949561   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:06.949573   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:06.949578   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:06.951703   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:07.449364   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:45:07.449385   37081 round_trippers.go:469] Request Headers:
	I0717 17:45:07.449393   37081 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:45:07.449397   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:45:07.452030   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:45:07.949753   37081 node_ready.go:38] duration metric: took 4m0.000631344s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:45:07.951998   37081 out.go:177] 
	W0717 17:45:07.953395   37081 out.go:239] X Exiting due to GUEST_NODE_START: failed to start node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0717 17:45:07.953410   37081 out.go:239] * 
	W0717 17:45:07.955456   37081 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_6a758bccf1d363a5d0799efcdea444172a621e97_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:45:07.957002   37081 out.go:177] 

** /stderr **
ha_test.go:422: I0717 17:40:49.743931   37081 out.go:291] Setting OutFile to fd 1 ...
I0717 17:40:49.744399   37081 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:40:49.744451   37081 out.go:304] Setting ErrFile to fd 2...
I0717 17:40:49.744468   37081 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:40:49.744961   37081 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
I0717 17:40:49.745606   37081 mustload.go:65] Loading cluster: ha-333994
I0717 17:40:49.745934   37081 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:40:49.746306   37081 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:40:49.746344   37081 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:40:49.761593   37081 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44971
I0717 17:40:49.761990   37081 main.go:141] libmachine: () Calling .GetVersion
I0717 17:40:49.762497   37081 main.go:141] libmachine: Using API Version  1
I0717 17:40:49.762518   37081 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:40:49.762807   37081 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:40:49.762989   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
W0717 17:40:49.764368   37081 host.go:58] "ha-333994-m02" host status: Stopped
I0717 17:40:49.766394   37081 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
I0717 17:40:49.767693   37081 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
I0717 17:40:49.767729   37081 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
I0717 17:40:49.767750   37081 cache.go:56] Caching tarball of preloaded images
I0717 17:40:49.767843   37081 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
I0717 17:40:49.767857   37081 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
I0717 17:40:49.768015   37081 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
I0717 17:40:49.768263   37081 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
I0717 17:40:49.768320   37081 start.go:364] duration metric: took 30.3µs to acquireMachinesLock for "ha-333994-m02"
I0717 17:40:49.768338   37081 start.go:96] Skipping create...Using existing machine configuration
I0717 17:40:49.768348   37081 fix.go:54] fixHost starting: m02
I0717 17:40:49.768726   37081 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:40:49.768756   37081 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:40:49.783201   37081 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40649
I0717 17:40:49.783660   37081 main.go:141] libmachine: () Calling .GetVersion
I0717 17:40:49.784131   37081 main.go:141] libmachine: Using API Version  1
I0717 17:40:49.784152   37081 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:40:49.784425   37081 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:40:49.784622   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
I0717 17:40:49.784799   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
I0717 17:40:49.786164   37081 fix.go:112] recreateIfNeeded on ha-333994-m02: state=Stopped err=<nil>
I0717 17:40:49.786187   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
W0717 17:40:49.786322   37081 fix.go:138] unexpected machine state, will restart: <nil>
I0717 17:40:49.788292   37081 out.go:177] * Restarting existing kvm2 VM for "ha-333994-m02" ...
I0717 17:40:49.789383   37081 main.go:141] libmachine: (ha-333994-m02) Calling .Start
I0717 17:40:49.789531   37081 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
I0717 17:40:49.790274   37081 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
I0717 17:40:49.790623   37081 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
I0717 17:40:49.790991   37081 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
I0717 17:40:49.791681   37081 main.go:141] libmachine: (ha-333994-m02) Creating domain...
I0717 17:40:50.968744   37081 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
I0717 17:40:50.969648   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:40:50.970132   37081 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
I0717 17:40:50.970159   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:40:50.970169   37081 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
I0717 17:40:50.970557   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:40:50.970578   37081 main.go:141] libmachine: (ha-333994-m02) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"}
I0717 17:40:50.970590   37081 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
I0717 17:40:50.970614   37081 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
I0717 17:40:50.970625   37081 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
I0717 17:40:50.972500   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:40:50.972819   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:40:50.972843   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:40:50.972943   37081 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
I0717 17:40:50.973032   37081 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
I0717 17:40:50.973064   37081 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
I0717 17:40:50.973093   37081 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
I0717 17:40:50.973103   37081 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
I0717 17:41:02.101982   37081 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
I0717 17:41:02.102417   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
I0717 17:41:02.103028   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
I0717 17:41:02.105503   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.105914   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.105956   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.106221   37081 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
I0717 17:41:02.106401   37081 machine.go:94] provisionDockerMachine start ...
I0717 17:41:02.106417   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
I0717 17:41:02.106633   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:02.108632   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.108946   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.108970   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.109089   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:02.109246   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.109376   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.109486   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:02.109640   37081 main.go:141] libmachine: Using SSH client type: native
I0717 17:41:02.109835   37081 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
I0717 17:41:02.109848   37081 main.go:141] libmachine: About to run SSH command:
hostname
I0717 17:41:02.210732   37081 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube

I0717 17:41:02.210765   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
I0717 17:41:02.211004   37081 buildroot.go:166] provisioning hostname "ha-333994-m02"
I0717 17:41:02.211027   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
I0717 17:41:02.211210   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:02.213859   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.214231   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.214256   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.214420   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:02.214628   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.214793   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.214928   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:02.215079   37081 main.go:141] libmachine: Using SSH client type: native
I0717 17:41:02.215234   37081 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
I0717 17:41:02.215244   37081 main.go:141] libmachine: About to run SSH command:
sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
I0717 17:41:02.329074   37081 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02

I0717 17:41:02.329101   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:02.332004   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.332346   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.332374   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.332529   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:02.332708   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.332884   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.333017   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:02.333200   37081 main.go:141] libmachine: Using SSH client type: native
I0717 17:41:02.333368   37081 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
I0717 17:41:02.333392   37081 main.go:141] libmachine: About to run SSH command:

		if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
			if grep -xq '127.0.1.1\s.*' /etc/hosts; then
				sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
			else 
				echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
			fi
		fi
I0717 17:41:02.439612   37081 main.go:141] libmachine: SSH cmd err, output: <nil>: 
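The `/etc/hosts` script the log runs above is idempotent: it only rewrites (or appends) the `127.0.1.1` entry when no line for the node name exists yet. A minimal standalone sketch of the same logic, run against a scratch copy so no sudo is needed (the path `/tmp/hosts.demo` and the seed contents are illustrative; the hostname is the one from this log):

```shell
# Scratch copy of /etc/hosts so the sketch needs no sudo.
HOSTS=/tmp/hosts.demo
NAME=ha-333994-m02

printf '127.0.0.1 localhost\n127.0.1.1 oldname\n' > "$HOSTS"

# Only touch the file if no entry for $NAME exists yet (idempotence).
if ! grep -q "[[:space:]]$NAME\$" "$HOSTS"; then
  if grep -q '^127\.0\.1\.1[[:space:]]' "$HOSTS"; then
    # An existing 127.0.1.1 entry: rewrite it in place.
    sed -i "s/^127\.0\.1\.1[[:space:]].*/127.0.1.1 $NAME/" "$HOSTS"
  else
    # No 127.0.1.1 line yet: append one.
    echo "127.0.1.1 $NAME" >> "$HOSTS"
  fi
fi
cat "$HOSTS"
```

Re-running the snippet leaves the file unchanged, which is why minikube can safely replay it on every provision.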
I0717 17:41:02.439645   37081 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
I0717 17:41:02.439684   37081 buildroot.go:174] setting up certificates
I0717 17:41:02.439694   37081 provision.go:84] configureAuth start
I0717 17:41:02.439712   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
I0717 17:41:02.439966   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
I0717 17:41:02.442371   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.442744   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.442778   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.442862   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:02.444931   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.445238   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.445271   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.445388   37081 provision.go:143] copyHostCerts
I0717 17:41:02.445424   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
I0717 17:41:02.445459   37081 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
I0717 17:41:02.445471   37081 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
I0717 17:41:02.445530   37081 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
I0717 17:41:02.445619   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
I0717 17:41:02.445642   37081 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
I0717 17:41:02.445649   37081 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
I0717 17:41:02.445679   37081 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
I0717 17:41:02.445740   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
I0717 17:41:02.445755   37081 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
I0717 17:41:02.445763   37081 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
I0717 17:41:02.445784   37081 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
I0717 17:41:02.445841   37081 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
I0717 17:41:02.589008   37081 provision.go:177] copyRemoteCerts
I0717 17:41:02.589063   37081 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
I0717 17:41:02.589084   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:02.591663   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.592005   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.592035   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.592250   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:02.592429   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.592584   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:02.592695   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
I0717 17:41:02.676421   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
I0717 17:41:02.676497   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
I0717 17:41:02.702202   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
I0717 17:41:02.702280   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
I0717 17:41:02.726723   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
I0717 17:41:02.726785   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
I0717 17:41:02.750662   37081 provision.go:87] duration metric: took 310.952949ms to configureAuth
I0717 17:41:02.750686   37081 buildroot.go:189] setting minikube options for container-runtime
I0717 17:41:02.750896   37081 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:41:02.750908   37081 machine.go:97] duration metric: took 644.496018ms to provisionDockerMachine
I0717 17:41:02.750915   37081 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
I0717 17:41:02.750927   37081 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
I0717 17:41:02.750956   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
I0717 17:41:02.751263   37081 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
I0717 17:41:02.751291   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:02.753491   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.753841   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.753864   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.753996   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:02.754173   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.754338   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:02.754476   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
I0717 17:41:02.832822   37081 ssh_runner.go:195] Run: cat /etc/os-release
I0717 17:41:02.837161   37081 info.go:137] Remote host: Buildroot 2023.02.9
I0717 17:41:02.837181   37081 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
I0717 17:41:02.837256   37081 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
I0717 17:41:02.837354   37081 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
I0717 17:41:02.837368   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
I0717 17:41:02.837481   37081 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
I0717 17:41:02.846992   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
I0717 17:41:02.870624   37081 start.go:296] duration metric: took 119.696511ms for postStartSetup
I0717 17:41:02.870658   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
I0717 17:41:02.870955   37081 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
I0717 17:41:02.870988   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:02.873305   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.873658   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:02.873695   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:02.873847   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:02.874026   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:02.874196   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:02.874348   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
I0717 17:41:02.952952   37081 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
I0717 17:41:02.953020   37081 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
I0717 17:41:03.009383   37081 fix.go:56] duration metric: took 13.241028492s for fixHost
I0717 17:41:03.009422   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:03.012020   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:03.012379   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:03.012404   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:03.012612   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:03.012790   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:03.012929   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:03.013094   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:03.013227   37081 main.go:141] libmachine: Using SSH client type: native
I0717 17:41:03.013381   37081 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
I0717 17:41:03.013389   37081 main.go:141] libmachine: About to run SSH command:
date +%!s(MISSING).%!N(MISSING)
I0717 17:41:03.114865   37081 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238063.079993674

I0717 17:41:03.114884   37081 fix.go:216] guest clock: 1721238063.079993674
I0717 17:41:03.114891   37081 fix.go:229] Guest: 2024-07-17 17:41:03.079993674 +0000 UTC Remote: 2024-07-17 17:41:03.009406179 +0000 UTC m=+13.299862053 (delta=70.587495ms)
I0717 17:41:03.114922   37081 fix.go:200] guest clock delta is within tolerance: 70.587495ms
I0717 17:41:03.114927   37081 start.go:83] releasing machines lock for "ha-333994-m02", held for 13.346596861s
I0717 17:41:03.114944   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
I0717 17:41:03.115188   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
I0717 17:41:03.117612   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:03.117941   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:03.117964   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:03.118085   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
I0717 17:41:03.118576   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
I0717 17:41:03.118728   37081 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
I0717 17:41:03.118825   37081 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
I0717 17:41:03.118860   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:03.118974   37081 ssh_runner.go:195] Run: systemctl --version
I0717 17:41:03.118994   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
I0717 17:41:03.121310   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:03.121527   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:03.121705   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:03.121729   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:03.121849   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:03.121867   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:03.121884   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:03.122012   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
I0717 17:41:03.122084   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:03.122191   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
I0717 17:41:03.122369   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:03.122399   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
I0717 17:41:03.122523   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
I0717 17:41:03.122564   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
I0717 17:41:03.227400   37081 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
W0717 17:41:03.233219   37081 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
I0717 17:41:03.233281   37081 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
I0717 17:41:03.249175   37081 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
I0717 17:41:03.249204   37081 start.go:495] detecting cgroup driver to use...
I0717 17:41:03.249270   37081 ssh_runner.go:195] Run: sudo systemctl stop -f crio
I0717 17:41:03.273876   37081 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
I0717 17:41:03.287441   37081 docker.go:217] disabling cri-docker service (if available) ...
I0717 17:41:03.287508   37081 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
I0717 17:41:03.302143   37081 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
I0717 17:41:03.315989   37081 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
I0717 17:41:03.429297   37081 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
I0717 17:41:03.577522   37081 docker.go:233] disabling docker service ...
I0717 17:41:03.577615   37081 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
I0717 17:41:03.592287   37081 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
I0717 17:41:03.604967   37081 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
I0717 17:41:03.745354   37081 ssh_runner.go:195] Run: sudo systemctl mask docker.service
I0717 17:41:03.870943   37081 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
I0717 17:41:03.891622   37081 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
" | sudo tee /etc/crictl.yaml"
I0717 17:41:03.910637   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
I0717 17:41:03.921410   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
I0717 17:41:03.932379   37081 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
I0717 17:41:03.932468   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
I0717 17:41:03.943824   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
I0717 17:41:03.954702   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
I0717 17:41:03.965664   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
I0717 17:41:03.976018   37081 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
I0717 17:41:03.986690   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
I0717 17:41:03.996765   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
I0717 17:41:04.007054   37081 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
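The containerd reconfiguration above is a sequence of in-place `sed` edits to `/etc/containerd/config.toml`. A sketch reproducing three of the key edits from the log against a scratch config (the path `/tmp/config.demo.toml` and the seed TOML are illustrative, not the real shipped config):

```shell
CFG=/tmp/config.demo.toml
cat > "$CFG" <<'EOF'
[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.8"
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
    SystemdCgroup = true
EOF

# Pin the pause image version, as the log does.
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' "$CFG"
# Switch runc to the cgroupfs driver (SystemdCgroup = false).
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$CFG"
# Inject enable_unprivileged_ports directly under the CRI plugin table,
# preserving the matched line's indentation via the \1 capture.
sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' "$CFG"
cat "$CFG"
```

Each edit captures the leading indentation so the rewritten keys stay aligned with whatever nesting depth the real config uses.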
I0717 17:41:04.017289   37081 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
I0717 17:41:04.026360   37081 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
stdout:

stderr:
sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
I0717 17:41:04.026417   37081 ssh_runner.go:195] Run: sudo modprobe br_netfilter
I0717 17:41:04.039290   37081 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
I0717 17:41:04.048597   37081 ssh_runner.go:195] Run: sudo systemctl daemon-reload
I0717 17:41:04.165587   37081 ssh_runner.go:195] Run: sudo systemctl restart containerd
I0717 17:41:04.194954   37081 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
I0717 17:41:04.195028   37081 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
I0717 17:41:04.199446   37081 retry.go:31] will retry after 921.252694ms: stat /run/containerd/containerd.sock: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
I0717 17:41:05.121509   37081 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
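After restarting containerd, the log shows minikube polling `stat /run/containerd/containerd.sock` with retries until the daemon recreates its socket (one retry of ~921ms sufficed here). A minimal sketch of that poll-with-retry pattern, using an ordinary file `/tmp/containerd.sock.demo` in place of the real socket (path and timings are illustrative):

```shell
SOCK=/tmp/containerd.sock.demo
rm -f "$SOCK"
# Simulate the daemon creating its socket shortly after restart.
( sleep 1; touch "$SOCK" ) &

# Poll for the socket with a bounded number of retries,
# roughly what the "Will wait 60s for socket path" step does.
tries=0
until stat "$SOCK" >/dev/null 2>&1; do
  tries=$((tries + 1))
  [ "$tries" -ge 50 ] && { echo "timed out waiting for $SOCK"; exit 1; }
  sleep 0.2
done
wait
echo "socket present after $tries retries"
```

Bounding the retries turns a hung daemon into a clear failure instead of an indefinite stall, which is why the log advertises the 60s budget up front.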
I0717 17:41:05.127000   37081 start.go:563] Will wait 60s for crictl version
I0717 17:41:05.127053   37081 ssh_runner.go:195] Run: which crictl
I0717 17:41:05.130790   37081 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
I0717 17:41:05.171609   37081 start.go:579] Version:  0.1.0
RuntimeName:  containerd
RuntimeVersion:  v1.7.19
RuntimeApiVersion:  v1
I0717 17:41:05.171678   37081 ssh_runner.go:195] Run: containerd --version
I0717 17:41:05.199712   37081 ssh_runner.go:195] Run: containerd --version
I0717 17:41:05.227272   37081 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
I0717 17:41:05.228800   37081 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
I0717 17:41:05.231335   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:05.231812   37081 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
I0717 17:41:05.231840   37081 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
I0717 17:41:05.232000   37081 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
I0717 17:41:05.236169   37081 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
I0717 17:41:05.248698   37081 mustload.go:65] Loading cluster: ha-333994
I0717 17:41:05.248959   37081 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:41:05.249245   37081 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:41:05.249280   37081 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:41:05.264378   37081 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38343
I0717 17:41:05.264857   37081 main.go:141] libmachine: () Calling .GetVersion
I0717 17:41:05.265306   37081 main.go:141] libmachine: Using API Version  1
I0717 17:41:05.265326   37081 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:41:05.265616   37081 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:41:05.265794   37081 main.go:141] libmachine: (ha-333994) Calling .GetState
I0717 17:41:05.267403   37081 host.go:66] Checking if "ha-333994" exists ...
I0717 17:41:05.267735   37081 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:41:05.267774   37081 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:41:05.281710   37081 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39581
I0717 17:41:05.282146   37081 main.go:141] libmachine: () Calling .GetVersion
I0717 17:41:05.282650   37081 main.go:141] libmachine: Using API Version  1
I0717 17:41:05.282670   37081 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:41:05.282954   37081 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:41:05.283107   37081 main.go:141] libmachine: (ha-333994) Calling .DriverName
I0717 17:41:05.283258   37081 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
I0717 17:41:05.283270   37081 certs.go:194] generating shared ca certs ...
I0717 17:41:05.283286   37081 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
I0717 17:41:05.283466   37081 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
I0717 17:41:05.283521   37081 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
I0717 17:41:05.283533   37081 certs.go:256] generating profile certs ...
I0717 17:41:05.283647   37081 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
I0717 17:41:05.283718   37081 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
I0717 17:41:05.283774   37081 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
I0717 17:41:05.283788   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
I0717 17:41:05.283804   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
I0717 17:41:05.283824   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
I0717 17:41:05.283839   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
I0717 17:41:05.283855   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
I0717 17:41:05.283872   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
I0717 17:41:05.283888   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
I0717 17:41:05.283904   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
I0717 17:41:05.283955   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
W0717 17:41:05.283998   37081 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
I0717 17:41:05.284008   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
I0717 17:41:05.284044   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
I0717 17:41:05.284072   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
I0717 17:41:05.284096   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
I0717 17:41:05.284160   37081 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
I0717 17:41:05.284195   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
I0717 17:41:05.284218   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
I0717 17:41:05.284235   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
I0717 17:41:05.284263   37081 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
I0717 17:41:05.286976   37081 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
I0717 17:41:05.287365   37081 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
I0717 17:41:05.287388   37081 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
I0717 17:41:05.287485   37081 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
I0717 17:41:05.287637   37081 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
I0717 17:41:05.287766   37081 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
I0717 17:41:05.287887   37081 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
I0717 17:41:05.366513   37081 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
I0717 17:41:05.371830   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
I0717 17:41:05.384549   37081 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
I0717 17:41:05.388961   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
I0717 17:41:05.399728   37081 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
I0717 17:41:05.404176   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
I0717 17:41:05.416124   37081 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
I0717 17:41:05.420804   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
I0717 17:41:05.435197   37081 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
I0717 17:41:05.440207   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
I0717 17:41:05.453379   37081 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
I0717 17:41:05.458855   37081 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
I0717 17:41:05.471182   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
I0717 17:41:05.500866   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
I0717 17:41:05.526100   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
I0717 17:41:05.550463   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
I0717 17:41:05.575524   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
I0717 17:41:05.600591   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
I0717 17:41:05.627676   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
I0717 17:41:05.651870   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
I0717 17:41:05.677550   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
I0717 17:41:05.701832   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
I0717 17:41:05.725559   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
I0717 17:41:05.754176   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
I0717 17:41:05.770797   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
I0717 17:41:05.787548   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
I0717 17:41:05.804053   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
I0717 17:41:05.820440   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
I0717 17:41:05.837358   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
I0717 17:41:05.854104   37081 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
I0717 17:41:05.871924   37081 ssh_runner.go:195] Run: openssl version
I0717 17:41:05.877834   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
I0717 17:41:05.888389   37081 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
I0717 17:41:05.892833   37081 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
I0717 17:41:05.892889   37081 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
I0717 17:41:05.898728   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
I0717 17:41:05.909438   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
I0717 17:41:05.919919   37081 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
I0717 17:41:05.924346   37081 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
I0717 17:41:05.924420   37081 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
I0717 17:41:05.929890   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
I0717 17:41:05.940442   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
I0717 17:41:05.951137   37081 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
I0717 17:41:05.955755   37081 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
I0717 17:41:05.955823   37081 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
I0717 17:41:05.961522   37081 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
I0717 17:41:05.973007   37081 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
I0717 17:41:05.977019   37081 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
I0717 17:41:05.977078   37081 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
I0717 17:41:05.977185   37081 kubeadm.go:946] kubelet [Unit]
Wants=containerd.service

[Service]
ExecStart=
ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127

[Install]
config:
{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
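The kubelet drop-in logged above is rendered from the node's parameters (Kubernetes version, hostname override, node IP). A minimal sketch of that templating step, using a hypothetical helper name rather than minikube's actual kubeadm.go code:

```python
# Sketch: assemble the kubelet ExecStart line shown in the log from node
# parameters. render_kubelet_exec is a hypothetical helper, not minikube code.
def render_kubelet_exec(version: str, hostname: str, node_ip: str) -> str:
    binary = f"/var/lib/minikube/binaries/{version}/kubelet"
    flags = [
        "--bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf",
        "--config=/var/lib/kubelet/config.yaml",
        f"--hostname-override={hostname}",
        "--kubeconfig=/etc/kubernetes/kubelet.conf",
        f"--node-ip={node_ip}",
    ]
    return "ExecStart=" + binary + " " + " ".join(flags)

line = render_kubelet_exec("v1.30.2", "ha-333994-m02", "192.168.39.127")
```

The systemd drop-in empties `ExecStart=` first and then sets the full command, which is why the unit above contains two `ExecStart` lines.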
I0717 17:41:05.977223   37081 kube-vip.go:115] generating kube-vip config ...
I0717 17:41:05.977266   37081 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
I0717 17:41:05.993693   37081 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
I0717 17:41:05.993815   37081 kube-vip.go:137] kube-vip config:
apiVersion: v1
kind: Pod
metadata:
  creationTimestamp: null
  name: kube-vip
  namespace: kube-system
spec:
  containers:
  - args:
    - manager
    env:
    - name: vip_arp
      value: "true"
    - name: port
      value: "8443"
    - name: vip_nodename
      valueFrom:
        fieldRef:
          fieldPath: spec.nodeName
    - name: vip_interface
      value: eth0
    - name: vip_cidr
      value: "32"
    - name: dns_mode
      value: first
    - name: cp_enable
      value: "true"
    - name: cp_namespace
      value: kube-system
    - name: vip_leaderelection
      value: "true"
    - name: vip_leasename
      value: plndr-cp-lock
    - name: vip_leaseduration
      value: "5"
    - name: vip_renewdeadline
      value: "3"
    - name: vip_retryperiod
      value: "1"
    - name: address
      value: 192.168.39.254
    - name: prometheus_server
      value: :2112
    - name : lb_enable
      value: "true"
    - name: lb_port
      value: "8443"
    image: ghcr.io/kube-vip/kube-vip:v0.8.0
    imagePullPolicy: IfNotPresent
    name: kube-vip
    resources: {}
    securityContext:
      capabilities:
        add:
        - NET_ADMIN
        - NET_RAW
    volumeMounts:
    - mountPath: /etc/kubernetes/admin.conf
      name: kubeconfig
  hostAliases:
  - hostnames:
    - kubernetes
    ip: 127.0.0.1
  hostNetwork: true
  volumes:
  - hostPath:
      path: "/etc/kubernetes/admin.conf"
    name: kubeconfig
status: {}
I0717 17:41:05.993890   37081 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
I0717 17:41:06.004772   37081 binaries.go:47] Didn't find k8s binaries: didn't find preexisting kubelet
Initiating transfer...
I0717 17:41:06.004860   37081 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
I0717 17:41:06.015193   37081 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256
I0717 17:41:06.015201   37081 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256
I0717 17:41:06.015224   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm -> /var/lib/minikube/binaries/v1.30.2/kubeadm
I0717 17:41:06.015241   37081 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
I0717 17:41:06.015201   37081 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
I0717 17:41:06.015307   37081 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubeadm
I0717 17:41:06.015314   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
I0717 17:41:06.015377   37081 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubectl
I0717 17:41:06.030872   37081 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubeadm: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubeadm: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubeadm': No such file or directory
I0717 17:41:06.030902   37081 ssh_runner.go:356] copy: skipping /var/lib/minikube/binaries/v1.30.2/kubectl (exists)
I0717 17:41:06.030872   37081 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet -> /var/lib/minikube/binaries/v1.30.2/kubelet
I0717 17:41:06.030918   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm --> /var/lib/minikube/binaries/v1.30.2/kubeadm (50249880 bytes)
I0717 17:41:06.031007   37081 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubelet
I0717 17:41:06.053246   37081 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubelet: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubelet: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubelet': No such file or directory
I0717 17:41:06.053293   37081 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet --> /var/lib/minikube/binaries/v1.30.2/kubelet (100124920 bytes)
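The `?checksum=file:...sha256` download URLs logged above instruct the fetcher to verify each binary against its published SHA-256 digest. A minimal sketch of that verification step, run on local bytes instead of a real download:

```python
import hashlib

def verify_sha256(data: bytes, expected_hex: str) -> bool:
    # Compare the SHA-256 of the fetched bytes against the published digest;
    # a mismatch would mean a corrupt or tampered download.
    return hashlib.sha256(data).hexdigest() == expected_hex

# Well-known digest of b"hello", used here purely for demonstration.
ok = verify_sha256(
    b"hello",
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
)
```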
I0717 17:41:06.744852   37081 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
I0717 17:41:06.754277   37081 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
I0717 17:41:06.770802   37081 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
I0717 17:41:06.787285   37081 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
I0717 17:41:06.803568   37081 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
I0717 17:41:06.807750   37081 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
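The `/etc/hosts` rewrite above is idempotent: it filters out any stale `control-plane.minikube.internal` line before appending the current mapping, so repeated starts never accumulate duplicates. A sketch of the same logic in Python, operating on a temp file rather than the real `/etc/hosts`:

```python
# Sketch of the idempotent hosts update: drop any existing line for the
# control-plane alias, then append the current IP mapping.
import os
import tempfile

def set_host_entry(path: str, ip: str,
                   host: str = "control-plane.minikube.internal") -> None:
    with open(path) as f:
        kept = [l for l in f if not l.rstrip("\n").endswith("\t" + host)]
    kept.append(f"{ip}\t{host}\n")
    with open(path, "w") as f:
        f.writelines(kept)

fd, hosts_path = tempfile.mkstemp()
os.close(fd)
with open(hosts_path, "w") as f:
    f.write("127.0.0.1\tlocalhost\n")

set_host_entry(hosts_path, "192.168.39.254")
set_host_entry(hosts_path, "192.168.39.254")  # second run must not duplicate
```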
I0717 17:41:06.819602   37081 ssh_runner.go:195] Run: sudo systemctl daemon-reload
I0717 17:41:06.926111   37081 ssh_runner.go:195] Run: sudo systemctl start kubelet
I0717 17:41:06.948115   37081 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
I0717 17:41:06.948208   37081 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
I0717 17:41:06.948327   37081 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:41:06.950375   37081 out.go:177] * Enabled addons: 
I0717 17:41:06.950381   37081 out.go:177] * Verifying Kubernetes components...
I0717 17:41:06.951865   37081 addons.go:510] duration metric: took 3.670088ms for enable addons: enabled=[]
I0717 17:41:06.951955   37081 ssh_runner.go:195] Run: sudo systemctl daemon-reload
I0717 17:41:07.093263   37081 ssh_runner.go:195] Run: sudo systemctl start kubelet
I0717 17:41:07.948216   37081 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
I0717 17:41:07.948479   37081 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}
, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
W0717 17:41:07.948551   37081 kubeadm.go:483] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.180:8443
I0717 17:41:07.948937   37081 cert_rotation.go:137] Starting client certificate rotation controller
I0717 17:41:07.949083   37081 node_ready.go:35] waiting up to 6m0s for node "ha-333994-m02" to be "Ready" ...
I0717 17:41:07.949172   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:07.949182   37081 round_trippers.go:469] Request Headers:
I0717 17:41:07.949192   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:07.949196   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:07.957921   37081 round_trippers.go:574] Response Status: 404 Not Found in 8 milliseconds
I0717 17:41:08.449574   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:08.449604   37081 round_trippers.go:469] Request Headers:
I0717 17:41:08.449615   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:08.449620   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:08.451972   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:08.949592   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:08.949615   37081 round_trippers.go:469] Request Headers:
I0717 17:41:08.949623   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:08.949627   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:08.952028   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:09.449641   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:09.449668   37081 round_trippers.go:469] Request Headers:
I0717 17:41:09.449678   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:09.449714   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:09.452245   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:09.949258   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:09.949281   37081 round_trippers.go:469] Request Headers:
I0717 17:41:09.949289   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:09.949295   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:09.951605   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:09.951703   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
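The repeating GET/404 pattern above is a poll loop: minikube retries the node lookup roughly every 500 ms until the node object appears or the 6m0s budget runs out. A generic sketch of that retry-with-deadline shape (`wait_until` is a hypothetical helper, not minikube's node_ready.go):

```python
# Sketch: poll a check at a fixed interval until it passes or a deadline
# expires, as the node-ready wait loop in the log does.
import time

def wait_until(check, timeout_s: float = 360.0, interval_s: float = 0.5) -> bool:
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval_s)
    return False

attempts = {"n": 0}

def node_ready() -> bool:
    # Stand-in for the GET /api/v1/nodes/<name> probe; succeeds on poll 3.
    attempts["n"] += 1
    return attempts["n"] >= 3

ok = wait_until(node_ready, timeout_s=5.0, interval_s=0.01)
```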
I0717 17:41:10.450273   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:10.450296   37081 round_trippers.go:469] Request Headers:
I0717 17:41:10.450307   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:10.450311   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:10.452603   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:10.949271   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:10.949293   37081 round_trippers.go:469] Request Headers:
I0717 17:41:10.949306   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:10.949310   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:10.951728   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:11.449494   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:11.449520   37081 round_trippers.go:469] Request Headers:
I0717 17:41:11.449528   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:11.449532   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:11.451705   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:11.949403   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:11.949425   37081 round_trippers.go:469] Request Headers:
I0717 17:41:11.949433   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:11.949437   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:11.951816   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:11.951901   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:12.449412   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:12.449441   37081 round_trippers.go:469] Request Headers:
I0717 17:41:12.449452   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:12.449458   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:12.451824   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:12.949334   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:12.949356   37081 round_trippers.go:469] Request Headers:
I0717 17:41:12.949363   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:12.949368   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:12.951504   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:13.450291   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:13.450319   37081 round_trippers.go:469] Request Headers:
I0717 17:41:13.450329   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:13.450334   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:13.452660   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:13.949312   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:13.949332   37081 round_trippers.go:469] Request Headers:
I0717 17:41:13.949339   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:13.949343   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:13.951644   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:14.449301   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:14.449326   37081 round_trippers.go:469] Request Headers:
I0717 17:41:14.449331   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:14.449335   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:14.451960   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:14.452082   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:14.949587   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:14.949609   37081 round_trippers.go:469] Request Headers:
I0717 17:41:14.949615   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:14.949621   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:14.952057   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:15.449985   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:15.450015   37081 round_trippers.go:469] Request Headers:
I0717 17:41:15.450041   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:15.450046   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:15.452325   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:15.950081   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:15.950109   37081 round_trippers.go:469] Request Headers:
I0717 17:41:15.950135   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:15.950141   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:15.952228   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:16.449951   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:16.449972   37081 round_trippers.go:469] Request Headers:
I0717 17:41:16.449978   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:16.449981   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:16.455720   37081 round_trippers.go:574] Response Status: 404 Not Found in 5 milliseconds
I0717 17:41:16.455837   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:16.949409   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:16.949431   37081 round_trippers.go:469] Request Headers:
I0717 17:41:16.949440   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:16.949447   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:16.951499   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:17.450281   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:17.450311   37081 round_trippers.go:469] Request Headers:
I0717 17:41:17.450322   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:17.450327   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:17.452453   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:17.950231   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:17.950250   37081 round_trippers.go:469] Request Headers:
I0717 17:41:17.950260   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:17.950286   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:17.952495   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:18.450227   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:18.450250   37081 round_trippers.go:469] Request Headers:
I0717 17:41:18.450261   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:18.450267   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:18.452571   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:18.949247   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:18.949272   37081 round_trippers.go:469] Request Headers:
I0717 17:41:18.949281   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:18.949285   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:18.951540   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:18.951643   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:19.450258   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:19.450281   37081 round_trippers.go:469] Request Headers:
I0717 17:41:19.450288   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:19.450293   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:19.452577   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:19.949564   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:19.949586   37081 round_trippers.go:469] Request Headers:
I0717 17:41:19.949594   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:19.949599   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:19.952593   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:20.450276   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:20.450301   37081 round_trippers.go:469] Request Headers:
I0717 17:41:20.450312   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:20.450319   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:20.452399   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:20.950152   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:20.950174   37081 round_trippers.go:469] Request Headers:
I0717 17:41:20.950183   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:20.950188   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:20.952469   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:20.952566   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:21.450205   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:21.450266   37081 round_trippers.go:469] Request Headers:
I0717 17:41:21.450275   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:21.450278   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:21.452554   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:21.950280   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:21.950303   37081 round_trippers.go:469] Request Headers:
I0717 17:41:21.950311   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:21.950315   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:21.952523   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:22.450262   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:22.450285   37081 round_trippers.go:469] Request Headers:
I0717 17:41:22.450293   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:22.450297   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:22.452393   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:22.949997   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:22.950024   37081 round_trippers.go:469] Request Headers:
I0717 17:41:22.950034   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:22.950040   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:22.952355   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:23.450087   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:23.450110   37081 round_trippers.go:469] Request Headers:
I0717 17:41:23.450131   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:23.450134   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:23.452519   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:23.452597   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:23.950271   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:23.950300   37081 round_trippers.go:469] Request Headers:
I0717 17:41:23.950308   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:23.950312   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:23.952596   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:24.450135   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:24.450161   37081 round_trippers.go:469] Request Headers:
I0717 17:41:24.450173   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:24.450181   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:24.452581   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:24.949255   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:24.949277   37081 round_trippers.go:469] Request Headers:
I0717 17:41:24.949284   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:24.949287   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:24.951637   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:25.450074   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:25.450096   37081 round_trippers.go:469] Request Headers:
I0717 17:41:25.450103   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:25.450107   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:25.452622   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:25.452710   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:25.949273   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:25.949295   37081 round_trippers.go:469] Request Headers:
I0717 17:41:25.949303   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:25.949307   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:25.951790   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:26.449458   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:26.449481   37081 round_trippers.go:469] Request Headers:
I0717 17:41:26.449489   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:26.449492   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:26.451617   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:26.949266   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:26.949287   37081 round_trippers.go:469] Request Headers:
I0717 17:41:26.949295   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:26.949298   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:26.951455   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:27.450199   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:27.450220   37081 round_trippers.go:469] Request Headers:
I0717 17:41:27.450227   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:27.450232   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:27.452554   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:27.950193   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:27.950213   37081 round_trippers.go:469] Request Headers:
I0717 17:41:27.950221   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:27.950226   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:27.952490   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:27.952609   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:28.449743   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:28.449765   37081 round_trippers.go:469] Request Headers:
I0717 17:41:28.449772   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:28.449775   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:28.452096   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:28.949804   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:28.949831   37081 round_trippers.go:469] Request Headers:
I0717 17:41:28.949841   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:28.949847   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:28.952340   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:29.450126   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:29.450151   37081 round_trippers.go:469] Request Headers:
I0717 17:41:29.450162   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:29.450165   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:29.452229   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:29.950141   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:29.950166   37081 round_trippers.go:469] Request Headers:
I0717 17:41:29.950178   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:29.950184   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:29.952611   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:29.952728   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:30.449247   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:30.449268   37081 round_trippers.go:469] Request Headers:
I0717 17:41:30.449275   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:30.449279   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:30.451962   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:30.950279   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:30.950317   37081 round_trippers.go:469] Request Headers:
I0717 17:41:30.950326   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:30.950331   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:30.952449   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:31.450261   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:31.450286   37081 round_trippers.go:469] Request Headers:
I0717 17:41:31.450294   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:31.450300   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:31.452636   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:31.950035   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:31.950063   37081 round_trippers.go:469] Request Headers:
I0717 17:41:31.950074   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:31.950082   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:31.952680   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:31.952798   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:32.449412   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:32.449434   37081 round_trippers.go:469] Request Headers:
I0717 17:41:32.449441   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:32.449445   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:32.451648   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:32.949364   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:32.949386   37081 round_trippers.go:469] Request Headers:
I0717 17:41:32.949394   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:32.949398   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:32.951662   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:33.449339   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:33.449358   37081 round_trippers.go:469] Request Headers:
I0717 17:41:33.449364   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:33.449369   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:33.452011   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:33.949709   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:33.949730   37081 round_trippers.go:469] Request Headers:
I0717 17:41:33.949738   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:33.949742   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:33.951940   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:34.449328   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:34.449368   37081 round_trippers.go:469] Request Headers:
I0717 17:41:34.449377   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:34.449381   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:34.451572   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:34.451679   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:34.949348   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:34.949371   37081 round_trippers.go:469] Request Headers:
I0717 17:41:34.949379   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:34.949382   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:34.951712   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:35.449318   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:35.449343   37081 round_trippers.go:469] Request Headers:
I0717 17:41:35.449354   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:35.449359   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:35.451749   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:35.949457   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:35.949480   37081 round_trippers.go:469] Request Headers:
I0717 17:41:35.949486   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:35.949492   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:35.952154   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:36.449877   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:36.449904   37081 round_trippers.go:469] Request Headers:
I0717 17:41:36.449915   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:36.449920   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:36.452639   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:36.452763   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:36.949281   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:36.949305   37081 round_trippers.go:469] Request Headers:
I0717 17:41:36.949313   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:36.949316   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:36.951622   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:37.449403   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:37.449428   37081 round_trippers.go:469] Request Headers:
I0717 17:41:37.449440   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:37.449445   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:37.453253   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:41:37.950047   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:37.950075   37081 round_trippers.go:469] Request Headers:
I0717 17:41:37.950086   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:37.950090   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:37.952540   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:38.450296   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:38.450318   37081 round_trippers.go:469] Request Headers:
I0717 17:41:38.450326   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:38.450329   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:38.452789   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:38.452894   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:38.949454   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:38.949478   37081 round_trippers.go:469] Request Headers:
I0717 17:41:38.949489   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:38.949497   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:38.951996   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:39.449448   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:39.449498   37081 round_trippers.go:469] Request Headers:
I0717 17:41:39.449510   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:39.449538   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:39.452233   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:39.950110   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:39.950146   37081 round_trippers.go:469] Request Headers:
I0717 17:41:39.950157   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:39.950165   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:39.952408   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:40.450013   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:40.450036   37081 round_trippers.go:469] Request Headers:
I0717 17:41:40.450044   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:40.450047   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:40.452572   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:40.949272   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:40.949296   37081 round_trippers.go:469] Request Headers:
I0717 17:41:40.949304   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:40.949308   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:40.951587   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:40.951701   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:41.449270   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:41.449293   37081 round_trippers.go:469] Request Headers:
I0717 17:41:41.449300   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:41.449306   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:41.451720   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:41.949374   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:41.949418   37081 round_trippers.go:469] Request Headers:
I0717 17:41:41.949431   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:41.949437   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:41.951663   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:42.449983   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:42.450010   37081 round_trippers.go:469] Request Headers:
I0717 17:41:42.450022   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:42.450029   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:42.452815   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:42.949535   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:42.949558   37081 round_trippers.go:469] Request Headers:
I0717 17:41:42.949569   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:42.949577   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:42.951867   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:42.952082   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:43.449639   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:43.449661   37081 round_trippers.go:469] Request Headers:
I0717 17:41:43.449667   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:43.449671   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:43.452781   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:41:43.949332   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:43.949355   37081 round_trippers.go:469] Request Headers:
I0717 17:41:43.949362   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:43.949366   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:43.951818   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:44.449441   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:44.449480   37081 round_trippers.go:469] Request Headers:
I0717 17:41:44.449488   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:44.449492   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:44.451747   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:44.949229   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:44.949252   37081 round_trippers.go:469] Request Headers:
I0717 17:41:44.949263   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:44.949267   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:44.951587   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:45.450308   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:45.450337   37081 round_trippers.go:469] Request Headers:
I0717 17:41:45.450348   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:45.450365   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:45.453102   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:45.453209   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:45.949832   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:45.949854   37081 round_trippers.go:469] Request Headers:
I0717 17:41:45.949861   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:45.949865   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:45.952032   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:46.449696   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:46.449718   37081 round_trippers.go:469] Request Headers:
I0717 17:41:46.449726   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:46.449739   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:46.451961   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:46.949634   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:46.949659   37081 round_trippers.go:469] Request Headers:
I0717 17:41:46.949667   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:46.949672   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:46.952207   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:47.449968   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:47.449993   37081 round_trippers.go:469] Request Headers:
I0717 17:41:47.450000   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:47.450004   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:47.452167   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:47.949915   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:47.949937   37081 round_trippers.go:469] Request Headers:
I0717 17:41:47.949945   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:47.949950   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:47.952143   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:47.952245   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:48.449880   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:48.449901   37081 round_trippers.go:469] Request Headers:
I0717 17:41:48.449909   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:48.449914   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:48.452275   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:48.950008   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:48.950029   37081 round_trippers.go:469] Request Headers:
I0717 17:41:48.950036   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:48.950040   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:48.952295   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:49.449996   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:49.450017   37081 round_trippers.go:469] Request Headers:
I0717 17:41:49.450026   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:49.450029   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:49.453283   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:41:49.949327   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:49.949352   37081 round_trippers.go:469] Request Headers:
I0717 17:41:49.949363   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:49.949368   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:49.951627   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:50.449265   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:50.449285   37081 round_trippers.go:469] Request Headers:
I0717 17:41:50.449293   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:50.449297   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:50.451728   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:50.451845   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:50.949409   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:50.949433   37081 round_trippers.go:469] Request Headers:
I0717 17:41:50.949442   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:50.949445   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:50.951888   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:51.449551   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:51.449574   37081 round_trippers.go:469] Request Headers:
I0717 17:41:51.449581   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:51.449584   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:51.452113   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:51.949853   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:51.949874   37081 round_trippers.go:469] Request Headers:
I0717 17:41:51.949882   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:51.949886   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:51.952042   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:52.449276   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:52.449297   37081 round_trippers.go:469] Request Headers:
I0717 17:41:52.449308   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:52.449312   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:52.451308   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:41:52.950028   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:52.950050   37081 round_trippers.go:469] Request Headers:
I0717 17:41:52.950057   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:52.950061   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:52.952425   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:52.952547   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:53.450260   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:53.450290   37081 round_trippers.go:469] Request Headers:
I0717 17:41:53.450299   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:53.450305   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:53.453819   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:41:53.949661   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:53.949689   37081 round_trippers.go:469] Request Headers:
I0717 17:41:53.949709   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:53.949719   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:53.952584   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:54.449284   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:54.449307   37081 round_trippers.go:469] Request Headers:
I0717 17:41:54.449317   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:54.449322   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:54.451861   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:54.949602   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:54.949627   37081 round_trippers.go:469] Request Headers:
I0717 17:41:54.949639   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:54.949646   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:54.951952   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:55.449628   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:55.449650   37081 round_trippers.go:469] Request Headers:
I0717 17:41:55.449659   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:55.449662   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:55.452239   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:55.452470   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:55.950020   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:55.950042   37081 round_trippers.go:469] Request Headers:
I0717 17:41:55.950049   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:55.950053   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:55.952875   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:56.449458   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:56.449499   37081 round_trippers.go:469] Request Headers:
I0717 17:41:56.449510   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:56.449515   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:56.452058   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:56.949754   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:56.949830   37081 round_trippers.go:469] Request Headers:
I0717 17:41:56.949847   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:56.949855   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:56.952745   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:57.449390   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:57.449415   37081 round_trippers.go:469] Request Headers:
I0717 17:41:57.449427   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:57.449431   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:57.451956   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:57.949687   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:57.949709   37081 round_trippers.go:469] Request Headers:
I0717 17:41:57.949716   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:57.949719   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:57.951934   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:57.952044   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:41:58.449609   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:58.449631   37081 round_trippers.go:469] Request Headers:
I0717 17:41:58.449638   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:58.449642   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:58.452007   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:58.949259   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:58.949281   37081 round_trippers.go:469] Request Headers:
I0717 17:41:58.949289   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:58.949295   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:58.952116   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:59.449714   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:59.449735   37081 round_trippers.go:469] Request Headers:
I0717 17:41:59.449743   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:59.449747   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:59.452490   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:41:59.949340   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:41:59.949362   37081 round_trippers.go:469] Request Headers:
I0717 17:41:59.949370   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:41:59.949373   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:41:59.951421   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:00.450035   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:00.450057   37081 round_trippers.go:469] Request Headers:
I0717 17:42:00.450065   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:00.450069   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:00.452296   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:00.452414   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:00.950012   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:00.950035   37081 round_trippers.go:469] Request Headers:
I0717 17:42:00.950045   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:00.950051   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:00.952350   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:01.450104   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:01.450149   37081 round_trippers.go:469] Request Headers:
I0717 17:42:01.450160   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:01.450166   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:01.453050   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:01.949395   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:01.949416   37081 round_trippers.go:469] Request Headers:
I0717 17:42:01.949424   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:01.949427   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:01.951756   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:02.449445   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:02.449466   37081 round_trippers.go:469] Request Headers:
I0717 17:42:02.449474   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:02.449477   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:02.452034   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:02.949715   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:02.949743   37081 round_trippers.go:469] Request Headers:
I0717 17:42:02.949751   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:02.949755   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:02.952023   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:02.952116   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:03.449464   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:03.449484   37081 round_trippers.go:469] Request Headers:
I0717 17:42:03.449492   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:03.449498   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:03.452501   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:03.949243   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:03.949265   37081 round_trippers.go:469] Request Headers:
I0717 17:42:03.949273   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:03.949275   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:03.951456   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:04.450219   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:04.450241   37081 round_trippers.go:469] Request Headers:
I0717 17:42:04.450250   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:04.450253   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:04.452663   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:04.949323   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:04.949345   37081 round_trippers.go:469] Request Headers:
I0717 17:42:04.949353   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:04.949358   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:04.952407   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:42:04.952512   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:05.450134   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:05.450159   37081 round_trippers.go:469] Request Headers:
I0717 17:42:05.450170   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:05.450176   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:05.452421   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:05.950189   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:05.950212   37081 round_trippers.go:469] Request Headers:
I0717 17:42:05.950223   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:05.950230   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:05.952631   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:06.449284   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:06.449307   37081 round_trippers.go:469] Request Headers:
I0717 17:42:06.449315   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:06.449319   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:06.453013   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:42:06.949776   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:06.949800   37081 round_trippers.go:469] Request Headers:
I0717 17:42:06.949812   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:06.949817   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:06.952185   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:07.449923   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:07.449945   37081 round_trippers.go:469] Request Headers:
I0717 17:42:07.449952   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:07.449956   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:07.452200   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:07.452301   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:07.950008   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:07.950036   37081 round_trippers.go:469] Request Headers:
I0717 17:42:07.950045   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:07.950050   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:07.952799   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:08.449430   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:08.449454   37081 round_trippers.go:469] Request Headers:
I0717 17:42:08.449461   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:08.449466   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:08.451657   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:08.949300   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:08.949323   37081 round_trippers.go:469] Request Headers:
I0717 17:42:08.949330   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:08.949333   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:08.951568   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:09.450264   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:09.450288   37081 round_trippers.go:469] Request Headers:
I0717 17:42:09.450295   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:09.450299   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:09.452971   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:09.453075   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:09.949976   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:09.949998   37081 round_trippers.go:469] Request Headers:
I0717 17:42:09.950005   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:09.950025   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:09.952330   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:10.449997   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:10.450017   37081 round_trippers.go:469] Request Headers:
I0717 17:42:10.450025   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:10.450029   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:10.452873   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:10.949498   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:10.949521   37081 round_trippers.go:469] Request Headers:
I0717 17:42:10.949531   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:10.949536   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:10.952069   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:11.449766   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:11.449787   37081 round_trippers.go:469] Request Headers:
I0717 17:42:11.449796   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:11.449800   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:11.452274   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:11.949972   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:11.949996   37081 round_trippers.go:469] Request Headers:
I0717 17:42:11.950006   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:11.950012   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:11.952316   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:11.952430   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:12.450062   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:12.450094   37081 round_trippers.go:469] Request Headers:
I0717 17:42:12.450102   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:12.450107   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:12.452404   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:12.950102   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:12.950134   37081 round_trippers.go:469] Request Headers:
I0717 17:42:12.950144   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:12.950151   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:12.952144   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:42:13.449891   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:13.449913   37081 round_trippers.go:469] Request Headers:
I0717 17:42:13.449924   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:13.449929   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:13.452466   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:13.950255   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:13.950279   37081 round_trippers.go:469] Request Headers:
I0717 17:42:13.950289   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:13.950293   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:13.952781   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:13.952879   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:14.449447   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:14.449469   37081 round_trippers.go:469] Request Headers:
I0717 17:42:14.449477   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:14.449481   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:14.451992   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:14.949728   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:14.949753   37081 round_trippers.go:469] Request Headers:
I0717 17:42:14.949763   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:14.949768   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:14.952599   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:15.450266   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:15.450288   37081 round_trippers.go:469] Request Headers:
I0717 17:42:15.450299   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:15.450304   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:15.453204   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:15.949940   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:15.949962   37081 round_trippers.go:469] Request Headers:
I0717 17:42:15.949970   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:15.949973   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:15.952339   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:16.450085   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:16.450108   37081 round_trippers.go:469] Request Headers:
I0717 17:42:16.450144   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:16.450151   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:16.452619   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:16.452762   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:16.949271   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:16.949294   37081 round_trippers.go:469] Request Headers:
I0717 17:42:16.949318   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:16.949324   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:16.951767   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:17.449443   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:17.449465   37081 round_trippers.go:469] Request Headers:
I0717 17:42:17.449473   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:17.449478   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:17.451706   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:17.949361   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:17.949383   37081 round_trippers.go:469] Request Headers:
I0717 17:42:17.949391   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:17.949396   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:17.951886   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:18.449542   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:18.449566   37081 round_trippers.go:469] Request Headers:
I0717 17:42:18.449577   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:18.449583   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:18.451912   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:18.949553   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:18.949586   37081 round_trippers.go:469] Request Headers:
I0717 17:42:18.949596   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:18.949600   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:18.951917   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:18.952028   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:19.449581   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:19.449605   37081 round_trippers.go:469] Request Headers:
I0717 17:42:19.449616   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:19.449622   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:19.452330   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:19.949407   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:19.949431   37081 round_trippers.go:469] Request Headers:
I0717 17:42:19.949439   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:19.949447   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:19.951804   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:20.449423   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:20.449446   37081 round_trippers.go:469] Request Headers:
I0717 17:42:20.449454   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:20.449458   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:20.451926   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:20.949567   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:20.949592   37081 round_trippers.go:469] Request Headers:
I0717 17:42:20.949600   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:20.949604   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:20.951938   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:20.952072   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:21.449618   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:21.449639   37081 round_trippers.go:469] Request Headers:
I0717 17:42:21.449647   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:21.449651   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:21.451980   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:21.949635   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:21.949665   37081 round_trippers.go:469] Request Headers:
I0717 17:42:21.949676   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:21.949680   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:21.952119   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:22.449813   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:22.449834   37081 round_trippers.go:469] Request Headers:
I0717 17:42:22.449842   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:22.449845   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:22.452383   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:22.950149   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:22.950174   37081 round_trippers.go:469] Request Headers:
I0717 17:42:22.950183   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:22.950186   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:22.952425   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:22.952558   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:23.450160   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:23.450181   37081 round_trippers.go:469] Request Headers:
I0717 17:42:23.450205   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:23.450210   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:23.452807   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:23.949447   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:23.949468   37081 round_trippers.go:469] Request Headers:
I0717 17:42:23.949476   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:23.949481   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:23.951694   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:24.449386   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:24.449410   37081 round_trippers.go:469] Request Headers:
I0717 17:42:24.449417   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:24.449422   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:24.451935   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:24.950064   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:24.950085   37081 round_trippers.go:469] Request Headers:
I0717 17:42:24.950093   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:24.950096   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:24.952825   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:24.952949   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:25.449357   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:25.449379   37081 round_trippers.go:469] Request Headers:
I0717 17:42:25.449387   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:25.449391   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:25.451640   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:25.949325   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:25.949350   37081 round_trippers.go:469] Request Headers:
I0717 17:42:25.949362   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:25.949369   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:25.951688   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:26.449332   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:26.449355   37081 round_trippers.go:469] Request Headers:
I0717 17:42:26.449365   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:26.449392   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:26.452355   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:26.950074   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:26.950099   37081 round_trippers.go:469] Request Headers:
I0717 17:42:26.950109   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:26.950134   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:26.953041   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:26.953166   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:27.449261   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:27.449285   37081 round_trippers.go:469] Request Headers:
I0717 17:42:27.449293   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:27.449296   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:27.451952   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:27.949732   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:27.949751   37081 round_trippers.go:469] Request Headers:
I0717 17:42:27.949759   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:27.949763   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:27.952114   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:28.450003   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:28.450025   37081 round_trippers.go:469] Request Headers:
I0717 17:42:28.450049   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:28.450053   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:28.452455   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:28.950210   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:28.950232   37081 round_trippers.go:469] Request Headers:
I0717 17:42:28.950240   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:28.950244   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:28.952859   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:29.449739   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:29.449763   37081 round_trippers.go:469] Request Headers:
I0717 17:42:29.449773   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:29.449777   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:29.451965   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:29.452105   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:29.949898   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:29.949917   37081 round_trippers.go:469] Request Headers:
I0717 17:42:29.949924   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:29.949928   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:29.952501   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:30.450202   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:30.450222   37081 round_trippers.go:469] Request Headers:
I0717 17:42:30.450230   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:30.450235   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:30.452716   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:30.949329   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:30.949351   37081 round_trippers.go:469] Request Headers:
I0717 17:42:30.949359   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:30.949362   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:30.951464   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:31.450239   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:31.450262   37081 round_trippers.go:469] Request Headers:
I0717 17:42:31.450270   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:31.450274   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:31.452542   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:31.452672   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:31.950261   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:31.950281   37081 round_trippers.go:469] Request Headers:
I0717 17:42:31.950289   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:31.950293   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:31.952463   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:32.450189   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:32.450212   37081 round_trippers.go:469] Request Headers:
I0717 17:42:32.450219   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:32.450222   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:32.452565   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:32.949233   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:32.949264   37081 round_trippers.go:469] Request Headers:
I0717 17:42:32.949272   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:32.949276   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:32.951522   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:33.450243   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:33.450266   37081 round_trippers.go:469] Request Headers:
I0717 17:42:33.450275   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:33.450277   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:33.452546   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:33.950297   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:33.950320   37081 round_trippers.go:469] Request Headers:
I0717 17:42:33.950328   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:33.950331   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:33.952050   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:42:33.952167   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:34.449755   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:34.449777   37081 round_trippers.go:469] Request Headers:
I0717 17:42:34.449784   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:34.449788   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:34.452030   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:34.949974   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:34.950000   37081 round_trippers.go:469] Request Headers:
I0717 17:42:34.950009   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:34.950024   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:34.952958   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:35.449385   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:35.449409   37081 round_trippers.go:469] Request Headers:
I0717 17:42:35.449419   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:35.449424   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:35.452061   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:35.949709   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:35.949729   37081 round_trippers.go:469] Request Headers:
I0717 17:42:35.949743   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:35.949747   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:35.952326   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:35.952431   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:36.449818   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:36.449850   37081 round_trippers.go:469] Request Headers:
I0717 17:42:36.449862   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:36.449867   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:36.452678   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:36.949390   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:36.949415   37081 round_trippers.go:469] Request Headers:
I0717 17:42:36.949423   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:36.949428   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:36.951858   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:37.449504   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:37.449528   37081 round_trippers.go:469] Request Headers:
I0717 17:42:37.449535   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:37.449540   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:37.452005   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:37.949786   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:37.949809   37081 round_trippers.go:469] Request Headers:
I0717 17:42:37.949816   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:37.949821   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:37.951863   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:38.449617   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:38.449640   37081 round_trippers.go:469] Request Headers:
I0717 17:42:38.449647   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:38.449650   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:38.451786   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:38.451886   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:38.949432   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:38.949455   37081 round_trippers.go:469] Request Headers:
I0717 17:42:38.949463   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:38.949468   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:38.952153   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:39.450162   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:39.450188   37081 round_trippers.go:469] Request Headers:
I0717 17:42:39.450200   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:39.450208   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:39.452881   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:39.949935   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:39.949958   37081 round_trippers.go:469] Request Headers:
I0717 17:42:39.949964   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:39.949967   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:39.952181   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:40.449746   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:40.449772   37081 round_trippers.go:469] Request Headers:
I0717 17:42:40.449784   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:40.449789   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:40.452136   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:40.452234   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:40.949863   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:40.949884   37081 round_trippers.go:469] Request Headers:
I0717 17:42:40.949893   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:40.949898   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:40.952341   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:41.450082   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:41.450108   37081 round_trippers.go:469] Request Headers:
I0717 17:42:41.450127   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:41.450133   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:41.452540   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:41.950304   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:41.950339   37081 round_trippers.go:469] Request Headers:
I0717 17:42:41.950354   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:41.950359   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:41.952586   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:42.449269   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:42.449292   37081 round_trippers.go:469] Request Headers:
I0717 17:42:42.449303   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:42.449310   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:42.451834   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:42.949491   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:42.949518   37081 round_trippers.go:469] Request Headers:
I0717 17:42:42.949529   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:42.949538   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:42.951893   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:42.952034   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:43.449574   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:43.449600   37081 round_trippers.go:469] Request Headers:
I0717 17:42:43.449606   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:43.449611   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:43.452049   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:43.949752   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:43.949776   37081 round_trippers.go:469] Request Headers:
I0717 17:42:43.949785   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:43.949789   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:43.952469   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:44.450210   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:44.450232   37081 round_trippers.go:469] Request Headers:
I0717 17:42:44.450243   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:44.450248   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:44.458246   37081 round_trippers.go:574] Response Status: 404 Not Found in 7 milliseconds
I0717 17:42:44.950040   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:44.950067   37081 round_trippers.go:469] Request Headers:
I0717 17:42:44.950079   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:44.950086   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:44.952904   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:44.953011   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:45.449243   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:45.449266   37081 round_trippers.go:469] Request Headers:
I0717 17:42:45.449274   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:45.449279   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:45.451684   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:45.949326   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:45.949346   37081 round_trippers.go:469] Request Headers:
I0717 17:42:45.949354   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:45.949359   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:45.952193   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:46.449901   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:46.449922   37081 round_trippers.go:469] Request Headers:
I0717 17:42:46.449930   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:46.449935   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:46.452037   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:46.949713   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:46.949735   37081 round_trippers.go:469] Request Headers:
I0717 17:42:46.949741   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:46.949746   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:46.952339   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:47.450093   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:47.450136   37081 round_trippers.go:469] Request Headers:
I0717 17:42:47.450149   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:47.450153   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:47.452888   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:47.453002   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:47.949514   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:47.949539   37081 round_trippers.go:469] Request Headers:
I0717 17:42:47.949547   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:47.949551   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:47.952084   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:48.449750   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:48.449774   37081 round_trippers.go:469] Request Headers:
I0717 17:42:48.449782   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:48.449788   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:48.451639   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:42:48.949299   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:48.949320   37081 round_trippers.go:469] Request Headers:
I0717 17:42:48.949327   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:48.949331   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:48.951780   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:49.449511   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:49.449541   37081 round_trippers.go:469] Request Headers:
I0717 17:42:49.449549   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:49.449554   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:49.452235   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:49.949586   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:49.949615   37081 round_trippers.go:469] Request Headers:
I0717 17:42:49.949627   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:49.949634   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:49.952160   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:49.952256   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:50.449669   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:50.449692   37081 round_trippers.go:469] Request Headers:
I0717 17:42:50.449698   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:50.449703   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:50.452052   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:50.949707   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:50.949732   37081 round_trippers.go:469] Request Headers:
I0717 17:42:50.949739   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:50.949743   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:50.952243   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:51.449991   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:51.450014   37081 round_trippers.go:469] Request Headers:
I0717 17:42:51.450023   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:51.450028   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:51.452520   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:51.950261   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:51.950284   37081 round_trippers.go:469] Request Headers:
I0717 17:42:51.950293   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:51.950298   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:51.952661   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:51.952756   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:52.449337   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:52.449362   37081 round_trippers.go:469] Request Headers:
I0717 17:42:52.449370   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:52.449376   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:52.451885   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:52.949720   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:52.949750   37081 round_trippers.go:469] Request Headers:
I0717 17:42:52.949761   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:52.949767   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:52.952216   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:53.449991   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:53.450012   37081 round_trippers.go:469] Request Headers:
I0717 17:42:53.450021   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:53.450023   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:53.452656   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:53.949393   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:53.949417   37081 round_trippers.go:469] Request Headers:
I0717 17:42:53.949425   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:53.949428   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:53.951962   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:54.449616   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:54.449637   37081 round_trippers.go:469] Request Headers:
I0717 17:42:54.449645   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:54.449650   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:54.451976   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:54.452072   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:54.949841   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:54.949863   37081 round_trippers.go:469] Request Headers:
I0717 17:42:54.949871   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:54.949876   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:54.952235   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:55.449784   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:55.449806   37081 round_trippers.go:469] Request Headers:
I0717 17:42:55.449813   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:55.449818   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:55.452656   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:55.949958   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:55.949989   37081 round_trippers.go:469] Request Headers:
I0717 17:42:55.950000   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:55.950007   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:55.953067   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:42:56.449762   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:56.449792   37081 round_trippers.go:469] Request Headers:
I0717 17:42:56.449801   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:56.449808   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:56.452280   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:56.452382   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:56.950004   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:56.950027   37081 round_trippers.go:469] Request Headers:
I0717 17:42:56.950053   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:56.950058   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:56.952350   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:57.450085   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:57.450106   37081 round_trippers.go:469] Request Headers:
I0717 17:42:57.450114   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:57.450126   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:57.452524   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:57.949237   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:57.949259   37081 round_trippers.go:469] Request Headers:
I0717 17:42:57.949268   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:57.949273   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:57.951512   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:58.450248   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:58.450269   37081 round_trippers.go:469] Request Headers:
I0717 17:42:58.450276   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:58.450280   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:58.452532   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:58.452640   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:42:58.950235   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:58.950256   37081 round_trippers.go:469] Request Headers:
I0717 17:42:58.950264   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:58.950267   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:58.952459   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:59.450196   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:59.450218   37081 round_trippers.go:469] Request Headers:
I0717 17:42:59.450229   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:59.450234   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:59.452402   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:42:59.949216   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:42:59.949254   37081 round_trippers.go:469] Request Headers:
I0717 17:42:59.949266   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:42:59.949271   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:42:59.952323   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:00.449960   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:00.449987   37081 round_trippers.go:469] Request Headers:
I0717 17:43:00.449998   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:00.450002   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:00.452516   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:00.950240   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:00.950262   37081 round_trippers.go:469] Request Headers:
I0717 17:43:00.950270   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:00.950275   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:00.953877   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:00.953996   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:01.449581   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:01.449614   37081 round_trippers.go:469] Request Headers:
I0717 17:43:01.449622   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:01.449627   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:01.452177   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:01.949892   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:01.949915   37081 round_trippers.go:469] Request Headers:
I0717 17:43:01.949922   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:01.949926   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:01.952590   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:02.449219   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:02.449252   37081 round_trippers.go:469] Request Headers:
I0717 17:43:02.449259   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:02.449264   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:02.451507   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:02.950265   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:02.950288   37081 round_trippers.go:469] Request Headers:
I0717 17:43:02.950297   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:02.950302   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:02.952577   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:03.449245   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:03.449268   37081 round_trippers.go:469] Request Headers:
I0717 17:43:03.449278   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:03.449284   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:03.451802   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:03.451894   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:03.949467   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:03.949489   37081 round_trippers.go:469] Request Headers:
I0717 17:43:03.949498   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:03.949504   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:03.951569   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:04.449233   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:04.449258   37081 round_trippers.go:469] Request Headers:
I0717 17:43:04.449268   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:04.449273   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:04.451591   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:04.949228   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:04.949249   37081 round_trippers.go:469] Request Headers:
I0717 17:43:04.949256   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:04.949260   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:04.952224   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:05.449983   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:05.450009   37081 round_trippers.go:469] Request Headers:
I0717 17:43:05.450020   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:05.450026   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:05.452405   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:05.452513   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:05.950170   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:05.950190   37081 round_trippers.go:469] Request Headers:
I0717 17:43:05.950198   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:05.950205   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:05.952659   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:06.449329   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:06.449353   37081 round_trippers.go:469] Request Headers:
I0717 17:43:06.449361   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:06.449365   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:06.451851   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:06.949473   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:06.949515   37081 round_trippers.go:469] Request Headers:
I0717 17:43:06.949523   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:06.949530   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:06.952103   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:07.449761   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:07.449791   37081 round_trippers.go:469] Request Headers:
I0717 17:43:07.449802   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:07.449808   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:07.452032   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:07.949735   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:07.949758   37081 round_trippers.go:469] Request Headers:
I0717 17:43:07.949766   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:07.949769   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:07.953319   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:07.953438   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:08.450024   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:08.450047   37081 round_trippers.go:469] Request Headers:
I0717 17:43:08.450056   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:08.450059   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:08.452208   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:08.949907   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:08.949928   37081 round_trippers.go:469] Request Headers:
I0717 17:43:08.949936   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:08.949941   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:08.952344   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:09.450048   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:09.450069   37081 round_trippers.go:469] Request Headers:
I0717 17:43:09.450077   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:09.450080   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:09.452429   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:09.949367   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:09.949393   37081 round_trippers.go:469] Request Headers:
I0717 17:43:09.949406   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:09.949412   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:09.951593   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:10.450155   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:10.450181   37081 round_trippers.go:469] Request Headers:
I0717 17:43:10.450193   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:10.450197   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:10.452694   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:10.452818   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:10.949272   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:10.949296   37081 round_trippers.go:469] Request Headers:
I0717 17:43:10.949307   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:10.949313   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:10.951792   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:11.449438   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:11.449462   37081 round_trippers.go:469] Request Headers:
I0717 17:43:11.449473   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:11.449479   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:11.451715   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:11.949361   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:11.949383   37081 round_trippers.go:469] Request Headers:
I0717 17:43:11.949391   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:11.949395   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:11.951709   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:12.449339   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:12.449361   37081 round_trippers.go:469] Request Headers:
I0717 17:43:12.449369   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:12.449374   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:12.451777   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:12.949429   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:12.949456   37081 round_trippers.go:469] Request Headers:
I0717 17:43:12.949469   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:12.949475   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:12.951757   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:12.951900   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:13.449402   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:13.449424   37081 round_trippers.go:469] Request Headers:
I0717 17:43:13.449431   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:13.449435   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:13.451820   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:13.949514   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:13.949536   37081 round_trippers.go:469] Request Headers:
I0717 17:43:13.949544   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:13.949548   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:13.951751   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:14.449409   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:14.449430   37081 round_trippers.go:469] Request Headers:
I0717 17:43:14.449438   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:14.449443   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:14.451736   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:14.949496   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:14.949518   37081 round_trippers.go:469] Request Headers:
I0717 17:43:14.949525   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:14.949544   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:14.951847   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:14.951954   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:15.449514   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:15.449535   37081 round_trippers.go:469] Request Headers:
I0717 17:43:15.449551   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:15.449555   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:15.452399   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:15.950168   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:15.950189   37081 round_trippers.go:469] Request Headers:
I0717 17:43:15.950197   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:15.950201   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:15.952466   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:16.450211   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:16.450235   37081 round_trippers.go:469] Request Headers:
I0717 17:43:16.450242   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:16.450248   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:16.452886   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:16.949534   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:16.949559   37081 round_trippers.go:469] Request Headers:
I0717 17:43:16.949569   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:16.949577   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:16.952044   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:16.952163   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:17.449782   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:17.449806   37081 round_trippers.go:469] Request Headers:
I0717 17:43:17.449818   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:17.449823   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:17.452104   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:17.949825   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:17.949852   37081 round_trippers.go:469] Request Headers:
I0717 17:43:17.949863   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:17.949867   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:17.953070   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:18.449281   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:18.449303   37081 round_trippers.go:469] Request Headers:
I0717 17:43:18.449311   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:18.449314   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:18.451376   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:18.950155   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:18.950178   37081 round_trippers.go:469] Request Headers:
I0717 17:43:18.950186   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:18.950189   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:18.953193   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:18.953497   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:19.449703   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:19.449730   37081 round_trippers.go:469] Request Headers:
I0717 17:43:19.449738   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:19.449741   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:19.452061   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:19.950178   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:19.950199   37081 round_trippers.go:469] Request Headers:
I0717 17:43:19.950206   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:19.950212   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:19.952244   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:20.449855   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:20.449880   37081 round_trippers.go:469] Request Headers:
I0717 17:43:20.449892   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:20.449898   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:20.452427   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:20.950249   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:20.950277   37081 round_trippers.go:469] Request Headers:
I0717 17:43:20.950287   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:20.950296   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:20.953464   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:20.953576   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:21.450167   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:21.450189   37081 round_trippers.go:469] Request Headers:
I0717 17:43:21.450198   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:21.450221   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:21.452953   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:21.949607   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:21.949637   37081 round_trippers.go:469] Request Headers:
I0717 17:43:21.949645   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:21.949650   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:21.951991   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:22.449634   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:22.449655   37081 round_trippers.go:469] Request Headers:
I0717 17:43:22.449663   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:22.449667   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:22.452165   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:22.949903   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:22.949926   37081 round_trippers.go:469] Request Headers:
I0717 17:43:22.949933   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:22.949936   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:22.952319   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:23.450026   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:23.450047   37081 round_trippers.go:469] Request Headers:
I0717 17:43:23.450055   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:23.450059   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:23.452411   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:23.452508   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:23.949674   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:23.949701   37081 round_trippers.go:469] Request Headers:
I0717 17:43:23.949711   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:23.949715   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:23.951729   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:43:24.449389   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:24.449412   37081 round_trippers.go:469] Request Headers:
I0717 17:43:24.449420   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:24.449425   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:24.452037   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:24.950146   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:24.950170   37081 round_trippers.go:469] Request Headers:
I0717 17:43:24.950182   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:24.950188   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:24.952447   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:25.450003   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:25.450040   37081 round_trippers.go:469] Request Headers:
I0717 17:43:25.450049   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:25.450056   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:25.452607   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:25.452697   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:25.949278   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:25.949304   37081 round_trippers.go:469] Request Headers:
I0717 17:43:25.949314   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:25.949319   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:25.951746   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:26.449357   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:26.449377   37081 round_trippers.go:469] Request Headers:
I0717 17:43:26.449384   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:26.449389   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:26.452345   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:26.950085   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:26.950106   37081 round_trippers.go:469] Request Headers:
I0717 17:43:26.950131   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:26.950136   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:26.952493   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:27.450218   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:27.450234   37081 round_trippers.go:469] Request Headers:
I0717 17:43:27.450242   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:27.450246   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:27.452891   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:27.453001   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:27.949641   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:27.949665   37081 round_trippers.go:469] Request Headers:
I0717 17:43:27.949675   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:27.949680   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:27.952429   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:28.450202   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:28.450224   37081 round_trippers.go:469] Request Headers:
I0717 17:43:28.450233   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:28.450238   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:28.453125   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:28.949381   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:28.949406   37081 round_trippers.go:469] Request Headers:
I0717 17:43:28.949417   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:28.949420   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:28.960303   37081 round_trippers.go:574] Response Status: 404 Not Found in 10 milliseconds
I0717 17:43:29.450096   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:29.450148   37081 round_trippers.go:469] Request Headers:
I0717 17:43:29.450160   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:29.450171   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:29.452703   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:29.949805   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:29.949830   37081 round_trippers.go:469] Request Headers:
I0717 17:43:29.949840   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:29.949845   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:29.951897   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:29.951993   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:30.449276   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:30.449321   37081 round_trippers.go:469] Request Headers:
I0717 17:43:30.449329   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:30.449335   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:30.451746   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:30.949327   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:30.949349   37081 round_trippers.go:469] Request Headers:
I0717 17:43:30.949356   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:30.949360   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:30.951459   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:31.449264   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:31.449285   37081 round_trippers.go:469] Request Headers:
I0717 17:43:31.449293   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:31.449299   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:31.451885   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:31.949587   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:31.949613   37081 round_trippers.go:469] Request Headers:
I0717 17:43:31.949622   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:31.949626   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:31.952081   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:31.952199   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:32.449833   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:32.449857   37081 round_trippers.go:469] Request Headers:
I0717 17:43:32.449868   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:32.449876   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:32.452407   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:32.950175   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:32.950198   37081 round_trippers.go:469] Request Headers:
I0717 17:43:32.950207   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:32.950212   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:32.952414   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:33.450161   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:33.450208   37081 round_trippers.go:469] Request Headers:
I0717 17:43:33.450216   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:33.450220   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:33.452853   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:33.949508   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:33.949530   37081 round_trippers.go:469] Request Headers:
I0717 17:43:33.949538   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:33.949541   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:33.951578   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:34.449232   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:34.449255   37081 round_trippers.go:469] Request Headers:
I0717 17:43:34.449263   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:34.449266   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:34.451790   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:34.451921   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:34.949346   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:34.949370   37081 round_trippers.go:469] Request Headers:
I0717 17:43:34.949383   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:34.949394   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:34.951561   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:35.450238   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:35.450262   37081 round_trippers.go:469] Request Headers:
I0717 17:43:35.450270   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:35.450273   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:35.452939   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:35.949621   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:35.949652   37081 round_trippers.go:469] Request Headers:
I0717 17:43:35.949663   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:35.949668   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:35.951728   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:36.449403   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:36.449425   37081 round_trippers.go:469] Request Headers:
I0717 17:43:36.449432   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:36.449436   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:36.452967   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:36.453073   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:36.949625   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:36.949655   37081 round_trippers.go:469] Request Headers:
I0717 17:43:36.949667   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:36.949676   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:36.953243   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:37.449998   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:37.450023   37081 round_trippers.go:469] Request Headers:
I0717 17:43:37.450031   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:37.450035   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:37.452305   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:37.950171   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:37.950191   37081 round_trippers.go:469] Request Headers:
I0717 17:43:37.950199   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:37.950202   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:37.953584   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:38.450337   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:38.450360   37081 round_trippers.go:469] Request Headers:
I0717 17:43:38.450370   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:38.450374   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:38.453142   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:38.453253   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:38.949850   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:38.949890   37081 round_trippers.go:469] Request Headers:
I0717 17:43:38.949898   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:38.949901   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:38.952105   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:39.449815   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:39.449860   37081 round_trippers.go:469] Request Headers:
I0717 17:43:39.449871   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:39.449875   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:39.452559   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:39.949566   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:39.949593   37081 round_trippers.go:469] Request Headers:
I0717 17:43:39.949602   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:39.949608   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:39.952007   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:40.449641   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:40.449663   37081 round_trippers.go:469] Request Headers:
I0717 17:43:40.449671   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:40.449676   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:40.452044   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:40.949716   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:40.949739   37081 round_trippers.go:469] Request Headers:
I0717 17:43:40.949747   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:40.949751   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:40.952436   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:40.952607   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:41.450241   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:41.450267   37081 round_trippers.go:469] Request Headers:
I0717 17:43:41.450280   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:41.450285   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:41.452809   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:41.949507   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:41.949533   37081 round_trippers.go:469] Request Headers:
I0717 17:43:41.949552   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:41.949557   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:41.952213   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:42.450021   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:42.450041   37081 round_trippers.go:469] Request Headers:
I0717 17:43:42.450050   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:42.450055   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:42.453323   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:42.950057   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:42.950077   37081 round_trippers.go:469] Request Headers:
I0717 17:43:42.950084   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:42.950088   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:42.952462   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:43.450226   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:43.450249   37081 round_trippers.go:469] Request Headers:
I0717 17:43:43.450257   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:43.450260   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:43.452497   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:43.452606   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:43.950284   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:43.950304   37081 round_trippers.go:469] Request Headers:
I0717 17:43:43.950313   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:43.950317   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:43.952590   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:44.449264   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:44.449289   37081 round_trippers.go:469] Request Headers:
I0717 17:43:44.449300   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:44.449307   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:44.451645   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:44.949270   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:44.949291   37081 round_trippers.go:469] Request Headers:
I0717 17:43:44.949299   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:44.949303   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:44.952004   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:45.449741   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:45.449765   37081 round_trippers.go:469] Request Headers:
I0717 17:43:45.449773   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:45.449776   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:45.452283   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:45.950097   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:45.950138   37081 round_trippers.go:469] Request Headers:
I0717 17:43:45.950155   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:45.950159   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:45.952273   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:45.952358   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:46.450072   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:46.450099   37081 round_trippers.go:469] Request Headers:
I0717 17:43:46.450109   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:46.450134   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:46.452717   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:46.949389   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:46.949410   37081 round_trippers.go:469] Request Headers:
I0717 17:43:46.949418   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:46.949421   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:46.952138   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:47.449897   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:47.449924   37081 round_trippers.go:469] Request Headers:
I0717 17:43:47.449934   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:47.449939   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:47.453058   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:43:47.949786   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:47.949821   37081 round_trippers.go:469] Request Headers:
I0717 17:43:47.949829   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:47.949834   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:47.952393   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:47.952494   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:48.450108   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:48.450148   37081 round_trippers.go:469] Request Headers:
I0717 17:43:48.450158   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:48.450164   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:48.453081   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:48.949991   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:48.950014   37081 round_trippers.go:469] Request Headers:
I0717 17:43:48.950023   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:48.950027   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:48.952072   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:49.449744   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:49.449768   37081 round_trippers.go:469] Request Headers:
I0717 17:43:49.449778   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:49.449785   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:49.452085   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:49.950032   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:49.950056   37081 round_trippers.go:469] Request Headers:
I0717 17:43:49.950066   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:49.950071   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:49.952408   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:50.450005   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:50.450028   37081 round_trippers.go:469] Request Headers:
I0717 17:43:50.450038   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:50.450042   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:50.452729   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:50.452814   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:50.949373   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:50.949394   37081 round_trippers.go:469] Request Headers:
I0717 17:43:50.949402   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:50.949406   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:50.951516   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:51.450300   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:51.450328   37081 round_trippers.go:469] Request Headers:
I0717 17:43:51.450338   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:51.450343   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:51.452766   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:51.949423   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:51.949443   37081 round_trippers.go:469] Request Headers:
I0717 17:43:51.949451   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:51.949455   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:51.951817   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:52.449453   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:52.449478   37081 round_trippers.go:469] Request Headers:
I0717 17:43:52.449488   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:52.449493   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:52.451781   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:52.949436   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:52.949463   37081 round_trippers.go:469] Request Headers:
I0717 17:43:52.949475   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:52.949480   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:52.951832   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:52.951947   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:53.449339   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:53.449363   37081 round_trippers.go:469] Request Headers:
I0717 17:43:53.449371   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:53.449380   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:53.452104   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:53.949859   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:53.949880   37081 round_trippers.go:469] Request Headers:
I0717 17:43:53.949888   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:53.949891   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:53.952510   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:54.450264   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:54.450293   37081 round_trippers.go:469] Request Headers:
I0717 17:43:54.450304   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:54.450310   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:54.452913   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:54.949635   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:54.949657   37081 round_trippers.go:469] Request Headers:
I0717 17:43:54.949665   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:54.949670   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:54.952066   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:54.952157   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:55.449659   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:55.449690   37081 round_trippers.go:469] Request Headers:
I0717 17:43:55.449702   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:55.449710   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:55.452066   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:55.949743   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:55.949769   37081 round_trippers.go:469] Request Headers:
I0717 17:43:55.949781   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:55.949785   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:55.952471   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:56.449798   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:56.449819   37081 round_trippers.go:469] Request Headers:
I0717 17:43:56.449828   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:56.449832   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:56.452393   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:56.950161   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:56.950185   37081 round_trippers.go:469] Request Headers:
I0717 17:43:56.950193   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:56.950197   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:56.952650   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:56.952767   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:57.450285   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:57.450313   37081 round_trippers.go:469] Request Headers:
I0717 17:43:57.450325   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:57.450330   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:57.452799   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:57.949533   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:57.949554   37081 round_trippers.go:469] Request Headers:
I0717 17:43:57.949562   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:57.949565   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:57.951955   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:58.449776   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:58.449802   37081 round_trippers.go:469] Request Headers:
I0717 17:43:58.449814   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:58.449825   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:58.452100   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:58.949749   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:58.949771   37081 round_trippers.go:469] Request Headers:
I0717 17:43:58.949781   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:58.949787   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:58.952182   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:59.449929   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:59.449951   37081 round_trippers.go:469] Request Headers:
I0717 17:43:59.449959   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:59.449964   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:59.452235   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:43:59.452351   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:43:59.950255   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:43:59.950280   37081 round_trippers.go:469] Request Headers:
I0717 17:43:59.950292   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:43:59.950300   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:43:59.952723   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:00.449270   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:00.449294   37081 round_trippers.go:469] Request Headers:
I0717 17:44:00.449304   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:00.449309   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:00.452444   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:44:00.950203   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:00.950225   37081 round_trippers.go:469] Request Headers:
I0717 17:44:00.950232   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:00.950236   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:00.952504   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:01.450253   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:01.450275   37081 round_trippers.go:469] Request Headers:
I0717 17:44:01.450282   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:01.450286   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:01.452728   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:01.452839   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:01.949432   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:01.949459   37081 round_trippers.go:469] Request Headers:
I0717 17:44:01.949469   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:01.949474   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:01.951965   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:02.449629   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:02.449654   37081 round_trippers.go:469] Request Headers:
I0717 17:44:02.449663   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:02.449669   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:02.452190   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:02.949992   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:02.950013   37081 round_trippers.go:469] Request Headers:
I0717 17:44:02.950021   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:02.950025   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:02.952338   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:03.449636   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:03.449659   37081 round_trippers.go:469] Request Headers:
I0717 17:44:03.449669   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:03.449675   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:03.452455   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:03.950231   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:03.950254   37081 round_trippers.go:469] Request Headers:
I0717 17:44:03.950262   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:03.950266   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:03.952472   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:03.952579   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:04.450285   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:04.450310   37081 round_trippers.go:469] Request Headers:
I0717 17:44:04.450320   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:04.450323   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:04.452910   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:04.949616   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:04.949647   37081 round_trippers.go:469] Request Headers:
I0717 17:44:04.949660   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:04.949667   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:04.952331   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:05.449809   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:05.449830   37081 round_trippers.go:469] Request Headers:
I0717 17:44:05.449838   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:05.449841   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:05.452782   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:05.949342   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:05.949364   37081 round_trippers.go:469] Request Headers:
I0717 17:44:05.949372   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:05.949375   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:05.951801   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:06.449432   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:06.449455   37081 round_trippers.go:469] Request Headers:
I0717 17:44:06.449463   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:06.449467   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:06.452191   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:06.452308   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:06.949911   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:06.949935   37081 round_trippers.go:469] Request Headers:
I0717 17:44:06.949945   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:06.949949   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:06.952099   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:07.449779   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:07.449800   37081 round_trippers.go:469] Request Headers:
I0717 17:44:07.449808   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:07.449811   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:07.451995   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:07.949744   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:07.949771   37081 round_trippers.go:469] Request Headers:
I0717 17:44:07.949782   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:07.949788   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:07.952436   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:08.450209   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:08.450233   37081 round_trippers.go:469] Request Headers:
I0717 17:44:08.450244   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:08.450250   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:08.453079   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:08.453177   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:08.949739   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:08.949762   37081 round_trippers.go:469] Request Headers:
I0717 17:44:08.949770   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:08.949774   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:08.951845   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:09.449385   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:09.449407   37081 round_trippers.go:469] Request Headers:
I0717 17:44:09.449415   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:09.449418   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:09.452156   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:09.950165   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:09.950185   37081 round_trippers.go:469] Request Headers:
I0717 17:44:09.950192   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:09.950198   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:09.952468   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:10.450275   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:10.450297   37081 round_trippers.go:469] Request Headers:
I0717 17:44:10.450305   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:10.450314   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:10.452652   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:10.949338   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:10.949360   37081 round_trippers.go:469] Request Headers:
I0717 17:44:10.949368   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:10.949371   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:10.951962   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:10.952061   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:11.449611   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:11.449633   37081 round_trippers.go:469] Request Headers:
I0717 17:44:11.449640   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:11.449644   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:11.452208   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:11.950022   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:11.950045   37081 round_trippers.go:469] Request Headers:
I0717 17:44:11.950053   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:11.950057   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:11.952633   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:12.449271   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:12.449294   37081 round_trippers.go:469] Request Headers:
I0717 17:44:12.449301   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:12.449305   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:12.452220   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:12.949986   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:12.950007   37081 round_trippers.go:469] Request Headers:
I0717 17:44:12.950019   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:12.950024   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:12.953013   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:12.953114   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:13.449706   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:13.449731   37081 round_trippers.go:469] Request Headers:
I0717 17:44:13.449738   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:13.449743   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:13.452118   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:13.949877   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:13.949900   37081 round_trippers.go:469] Request Headers:
I0717 17:44:13.949908   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:13.949913   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:13.952803   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:14.449525   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:14.449551   37081 round_trippers.go:469] Request Headers:
I0717 17:44:14.449563   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:14.449571   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:14.452506   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:14.949258   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:14.949282   37081 round_trippers.go:469] Request Headers:
I0717 17:44:14.949296   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:14.949302   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:14.951822   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:15.449584   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:15.449605   37081 round_trippers.go:469] Request Headers:
I0717 17:44:15.449613   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:15.449616   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:15.452927   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:44:15.453039   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:15.949587   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:15.949607   37081 round_trippers.go:469] Request Headers:
I0717 17:44:15.949614   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:15.949617   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:15.952886   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:44:16.449537   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:16.449572   37081 round_trippers.go:469] Request Headers:
I0717 17:44:16.449580   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:16.449585   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:16.452021   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:16.949705   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:16.949737   37081 round_trippers.go:469] Request Headers:
I0717 17:44:16.949747   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:16.949754   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:16.952287   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:17.450010   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:17.450031   37081 round_trippers.go:469] Request Headers:
I0717 17:44:17.450039   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:17.450043   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:17.452650   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:17.949451   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:17.949475   37081 round_trippers.go:469] Request Headers:
I0717 17:44:17.949486   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:17.949491   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:17.953180   37081 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
I0717 17:44:17.953292   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:18.449869   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:18.449901   37081 round_trippers.go:469] Request Headers:
I0717 17:44:18.449910   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:18.449914   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:18.452248   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:18.949974   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:18.949997   37081 round_trippers.go:469] Request Headers:
I0717 17:44:18.950007   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:18.950013   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:18.952145   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:19.449900   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:19.449921   37081 round_trippers.go:469] Request Headers:
I0717 17:44:19.449929   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:19.449934   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:19.452221   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:19.949192   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:19.949213   37081 round_trippers.go:469] Request Headers:
I0717 17:44:19.949221   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:19.949226   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:19.951591   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:20.450198   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:20.450220   37081 round_trippers.go:469] Request Headers:
I0717 17:44:20.450228   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:20.450232   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:20.452626   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:20.452741   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:20.949250   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:20.949270   37081 round_trippers.go:469] Request Headers:
I0717 17:44:20.949277   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:20.949282   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:20.951567   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:21.449273   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:21.449297   37081 round_trippers.go:469] Request Headers:
I0717 17:44:21.449304   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:21.449307   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:21.451675   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:21.949322   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:21.949341   37081 round_trippers.go:469] Request Headers:
I0717 17:44:21.949348   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:21.949353   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:21.951532   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:22.450299   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:22.450327   37081 round_trippers.go:469] Request Headers:
I0717 17:44:22.450338   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:22.450344   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:22.452758   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:22.452850   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:22.950011   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:22.950047   37081 round_trippers.go:469] Request Headers:
I0717 17:44:22.950058   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:22.950067   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:22.952399   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:23.450155   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:23.450184   37081 round_trippers.go:469] Request Headers:
I0717 17:44:23.450196   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:23.450199   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:23.452635   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:23.949282   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:23.949319   37081 round_trippers.go:469] Request Headers:
I0717 17:44:23.949327   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:23.949332   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:23.951473   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:24.450191   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:24.450214   37081 round_trippers.go:469] Request Headers:
I0717 17:44:24.450223   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:24.450227   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:24.452721   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:24.949245   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:24.949270   37081 round_trippers.go:469] Request Headers:
I0717 17:44:24.949277   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:24.949282   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:24.951300   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:24.951403   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:25.449870   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:25.449890   37081 round_trippers.go:469] Request Headers:
I0717 17:44:25.449898   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:25.449902   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:25.452707   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:25.949253   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:25.949276   37081 round_trippers.go:469] Request Headers:
I0717 17:44:25.949284   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:25.949289   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:25.952042   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:26.449709   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:26.449732   37081 round_trippers.go:469] Request Headers:
I0717 17:44:26.449747   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:26.449753   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:26.451889   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:26.949593   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:26.949617   37081 round_trippers.go:469] Request Headers:
I0717 17:44:26.949627   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:26.949631   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:26.951301   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:44:26.951462   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:27.449611   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:27.449635   37081 round_trippers.go:469] Request Headers:
I0717 17:44:27.449643   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:27.449648   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:27.452090   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:27.949962   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:27.949987   37081 round_trippers.go:469] Request Headers:
I0717 17:44:27.950005   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:27.950010   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:27.952081   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:28.449736   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:28.449761   37081 round_trippers.go:469] Request Headers:
I0717 17:44:28.449773   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:28.449779   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:28.451848   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:28.949509   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:28.949531   37081 round_trippers.go:469] Request Headers:
I0717 17:44:28.949539   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:28.949543   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:28.951983   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:28.952103   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:29.449597   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:29.449622   37081 round_trippers.go:469] Request Headers:
I0717 17:44:29.449630   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:29.449635   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:29.452069   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:29.950006   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:29.950029   37081 round_trippers.go:469] Request Headers:
I0717 17:44:29.950039   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:29.950043   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:29.952677   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:30.450202   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:30.450227   37081 round_trippers.go:469] Request Headers:
I0717 17:44:30.450239   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:30.450244   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:30.452848   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:30.949540   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:30.949562   37081 round_trippers.go:469] Request Headers:
I0717 17:44:30.949569   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:30.949573   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:30.951882   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:31.449606   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:31.449629   37081 round_trippers.go:469] Request Headers:
I0717 17:44:31.449639   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:31.449643   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:31.452075   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:31.452193   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:31.949707   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:31.949738   37081 round_trippers.go:469] Request Headers:
I0717 17:44:31.949749   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:31.949753   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:31.952409   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:32.450111   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:32.450144   37081 round_trippers.go:469] Request Headers:
I0717 17:44:32.450152   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:32.450155   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:32.452581   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:32.949254   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:32.949277   37081 round_trippers.go:469] Request Headers:
I0717 17:44:32.949287   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:32.949292   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:32.951662   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:33.449313   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:33.449356   37081 round_trippers.go:469] Request Headers:
I0717 17:44:33.449367   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:33.449373   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:33.451866   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:33.949526   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:33.949545   37081 round_trippers.go:469] Request Headers:
I0717 17:44:33.949553   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:33.949558   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:33.951561   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:44:33.951676   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:34.449236   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:34.449257   37081 round_trippers.go:469] Request Headers:
I0717 17:44:34.449266   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:34.449270   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:34.451490   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:34.949232   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:34.949253   37081 round_trippers.go:469] Request Headers:
I0717 17:44:34.949261   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:34.949265   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:34.951833   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:35.449463   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:35.449483   37081 round_trippers.go:469] Request Headers:
I0717 17:44:35.449490   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:35.449494   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:35.451788   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:35.949441   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:35.949465   37081 round_trippers.go:469] Request Headers:
I0717 17:44:35.949473   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:35.949477   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:35.951679   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:35.951777   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:36.449328   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:36.449349   37081 round_trippers.go:469] Request Headers:
I0717 17:44:36.449358   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:36.449361   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:36.451819   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:36.949469   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:36.949492   37081 round_trippers.go:469] Request Headers:
I0717 17:44:36.949500   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:36.949503   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:36.951962   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:37.449641   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:37.449662   37081 round_trippers.go:469] Request Headers:
I0717 17:44:37.449670   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:37.449674   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:37.451999   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:37.949721   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:37.949745   37081 round_trippers.go:469] Request Headers:
I0717 17:44:37.949752   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:37.949757   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:37.952096   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:37.952210   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:38.449359   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:38.449382   37081 round_trippers.go:469] Request Headers:
I0717 17:44:38.449390   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:38.449393   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:38.452127   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:38.949866   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:38.949890   37081 round_trippers.go:469] Request Headers:
I0717 17:44:38.949899   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:38.949904   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:38.952243   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:39.449602   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:39.449624   37081 round_trippers.go:469] Request Headers:
I0717 17:44:39.449632   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:39.449637   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:39.452170   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:39.950010   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:39.950030   37081 round_trippers.go:469] Request Headers:
I0717 17:44:39.950038   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:39.950043   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:39.952179   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:39.952260   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:40.449694   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:40.449715   37081 round_trippers.go:469] Request Headers:
I0717 17:44:40.449726   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:40.449733   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:40.452021   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:40.949767   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:40.949794   37081 round_trippers.go:469] Request Headers:
I0717 17:44:40.949805   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:40.949809   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:40.952308   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:41.450023   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:41.450045   37081 round_trippers.go:469] Request Headers:
I0717 17:44:41.450052   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:41.450061   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:41.452372   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:41.950133   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:41.950159   37081 round_trippers.go:469] Request Headers:
I0717 17:44:41.950169   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:41.950173   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:41.952343   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:41.952449   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:42.450089   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:42.450111   37081 round_trippers.go:469] Request Headers:
I0717 17:44:42.450129   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:42.450137   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:42.452441   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:42.950199   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:42.950223   37081 round_trippers.go:469] Request Headers:
I0717 17:44:42.950230   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:42.950234   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:42.952567   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:43.449207   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:43.449245   37081 round_trippers.go:469] Request Headers:
I0717 17:44:43.449255   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:43.449260   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:43.451661   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:43.949276   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:43.949315   37081 round_trippers.go:469] Request Headers:
I0717 17:44:43.949323   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:43.949328   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:43.951510   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:44.450239   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:44.450265   37081 round_trippers.go:469] Request Headers:
I0717 17:44:44.450274   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:44.450278   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:44.452906   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:44.453018   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:44.949587   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:44.949608   37081 round_trippers.go:469] Request Headers:
I0717 17:44:44.949616   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:44.949619   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:44.951624   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:44:45.450017   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:45.450038   37081 round_trippers.go:469] Request Headers:
I0717 17:44:45.450047   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:45.450050   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:45.452427   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:45.950207   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:45.950231   37081 round_trippers.go:469] Request Headers:
I0717 17:44:45.950242   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:45.950247   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:45.952442   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:46.450186   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:46.450210   37081 round_trippers.go:469] Request Headers:
I0717 17:44:46.450222   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:46.450229   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:46.452460   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:46.950173   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:46.950196   37081 round_trippers.go:469] Request Headers:
I0717 17:44:46.950205   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:46.950210   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:46.952664   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:46.952869   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:47.449354   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:47.449375   37081 round_trippers.go:469] Request Headers:
I0717 17:44:47.449383   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:47.449387   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:47.451831   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:47.949561   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:47.949583   37081 round_trippers.go:469] Request Headers:
I0717 17:44:47.949591   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:47.949596   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:47.951797   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:48.449433   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:48.449455   37081 round_trippers.go:469] Request Headers:
I0717 17:44:48.449462   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:48.449471   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:48.451613   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:48.949282   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:48.949303   37081 round_trippers.go:469] Request Headers:
I0717 17:44:48.949311   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:48.949316   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:48.951384   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:49.450163   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:49.450186   37081 round_trippers.go:469] Request Headers:
I0717 17:44:49.450192   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:49.450196   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:49.452485   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:49.452604   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:49.950014   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:49.950036   37081 round_trippers.go:469] Request Headers:
I0717 17:44:49.950044   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:49.950049   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:49.952661   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:50.449236   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:50.449261   37081 round_trippers.go:469] Request Headers:
I0717 17:44:50.449269   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:50.449273   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:50.451559   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:50.949313   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:50.949334   37081 round_trippers.go:469] Request Headers:
I0717 17:44:50.949342   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:50.949346   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:50.951517   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:51.449322   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:51.449347   37081 round_trippers.go:469] Request Headers:
I0717 17:44:51.449358   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:51.449363   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:51.451910   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:51.949546   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:51.949587   37081 round_trippers.go:469] Request Headers:
I0717 17:44:51.949596   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:51.949601   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:51.951818   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:51.951915   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:52.449457   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:52.449484   37081 round_trippers.go:469] Request Headers:
I0717 17:44:52.449496   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:52.449503   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:52.451569   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:52.950109   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:52.950146   37081 round_trippers.go:469] Request Headers:
I0717 17:44:52.950154   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:52.950158   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:52.952456   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:53.450194   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:53.450215   37081 round_trippers.go:469] Request Headers:
I0717 17:44:53.450224   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:53.450228   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:53.452925   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:53.949603   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:53.949627   37081 round_trippers.go:469] Request Headers:
I0717 17:44:53.949636   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:53.949640   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:53.951992   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:53.952105   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:54.449662   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:54.449684   37081 round_trippers.go:469] Request Headers:
I0717 17:44:54.449692   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:54.449697   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:54.452086   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:54.949967   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:54.949992   37081 round_trippers.go:469] Request Headers:
I0717 17:44:54.950000   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:54.950006   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:54.952148   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:55.449585   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:55.449604   37081 round_trippers.go:469] Request Headers:
I0717 17:44:55.449612   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:55.449616   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:55.452219   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:55.950044   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:55.950068   37081 round_trippers.go:469] Request Headers:
I0717 17:44:55.950077   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:55.950083   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:55.954995   37081 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
I0717 17:44:55.955113   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:56.449365   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:56.449384   37081 round_trippers.go:469] Request Headers:
I0717 17:44:56.449393   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:56.449397   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:56.452012   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:56.949696   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:56.949718   37081 round_trippers.go:469] Request Headers:
I0717 17:44:56.949728   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:56.949732   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:56.951970   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:57.449647   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:57.449671   37081 round_trippers.go:469] Request Headers:
I0717 17:44:57.449681   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:57.449685   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:57.451813   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:57.949469   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:57.949500   37081 round_trippers.go:469] Request Headers:
I0717 17:44:57.949508   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:57.949513   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:57.951835   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:58.449444   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:58.449483   37081 round_trippers.go:469] Request Headers:
I0717 17:44:58.449491   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:58.449496   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:58.451784   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:58.451916   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:44:58.949414   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:58.949434   37081 round_trippers.go:469] Request Headers:
I0717 17:44:58.949442   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:58.949446   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:58.952341   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:59.449663   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:59.449684   37081 round_trippers.go:469] Request Headers:
I0717 17:44:59.449692   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:59.449696   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:59.451808   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:44:59.949569   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:44:59.949593   37081 round_trippers.go:469] Request Headers:
I0717 17:44:59.949602   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:44:59.949606   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:44:59.951748   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:00.449229   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:00.449252   37081 round_trippers.go:469] Request Headers:
I0717 17:45:00.449261   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:00.449266   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:00.451495   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:00.950164   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:00.950187   37081 round_trippers.go:469] Request Headers:
I0717 17:45:00.950195   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:00.950201   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:00.952265   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:00.952373   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:45:01.449827   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:01.449849   37081 round_trippers.go:469] Request Headers:
I0717 17:45:01.449858   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:01.449863   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:01.452083   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:01.949772   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:01.949798   37081 round_trippers.go:469] Request Headers:
I0717 17:45:01.949810   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:01.949815   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:01.952012   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:02.449672   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:02.449693   37081 round_trippers.go:469] Request Headers:
I0717 17:45:02.449700   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:02.449704   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:02.451891   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:02.949546   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:02.949568   37081 round_trippers.go:469] Request Headers:
I0717 17:45:02.949576   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:02.949580   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:02.951444   37081 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
I0717 17:45:03.450199   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:03.450223   37081 round_trippers.go:469] Request Headers:
I0717 17:45:03.450231   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:03.450235   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:03.452453   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:03.452559   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:45:03.950197   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:03.950218   37081 round_trippers.go:469] Request Headers:
I0717 17:45:03.950226   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:03.950230   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:03.952488   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:04.450076   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:04.450102   37081 round_trippers.go:469] Request Headers:
I0717 17:45:04.450112   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:04.450137   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:04.452431   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:04.950259   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:04.950282   37081 round_trippers.go:469] Request Headers:
I0717 17:45:04.950290   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:04.950294   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:04.952507   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:05.450157   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:05.450180   37081 round_trippers.go:469] Request Headers:
I0717 17:45:05.450191   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:05.450196   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:05.452443   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:05.950194   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:05.950218   37081 round_trippers.go:469] Request Headers:
I0717 17:45:05.950230   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:05.950238   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:05.952374   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:05.952471   37081 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
I0717 17:45:06.450143   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:06.450172   37081 round_trippers.go:469] Request Headers:
I0717 17:45:06.450187   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:06.450193   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:06.452832   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:06.949536   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:06.949561   37081 round_trippers.go:469] Request Headers:
I0717 17:45:06.949573   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:06.949578   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:06.951703   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:07.449364   37081 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
I0717 17:45:07.449385   37081 round_trippers.go:469] Request Headers:
I0717 17:45:07.449393   37081 round_trippers.go:473]     Accept: application/json, */*
I0717 17:45:07.449397   37081 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
I0717 17:45:07.452030   37081 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
I0717 17:45:07.949753   37081 node_ready.go:38] duration metric: took 4m0.000631344s for node "ha-333994-m02" to be "Ready" ...
I0717 17:45:07.951998   37081 out.go:177] 
W0717 17:45:07.953395   37081 out.go:239] X Exiting due to GUEST_NODE_START: failed to start node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
W0717 17:45:07.953410   37081 out.go:239] * 
W0717 17:45:07.955456   37081 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
│                                                                                             │
│    * If the above advice does not help, please let us know:                                 │
│      https://github.com/kubernetes/minikube/issues/new/choose                               │
│                                                                                             │
│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
│    * Please also attach the following file to the GitHub issue:                             │
│    * - /tmp/minikube_node_6a758bccf1d363a5d0799efcdea444172a621e97_0.log                    │
│                                                                                             │
╰─────────────────────────────────────────────────────────────────────────────────────────────╯
I0717 17:45:07.957002   37081 out.go:177] 
ha_test.go:423: secondary control-plane node start returned an error. args "out/minikube-linux-amd64 -p ha-333994 node start m02 -v=7 --alsologtostderr": exit status 80
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (602.611773ms)

                                                
                                                
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:45:08.196562   38047 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:45:08.197191   38047 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:08.197200   38047 out.go:304] Setting ErrFile to fd 2...
	I0717 17:45:08.197205   38047 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:08.197379   38047 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:45:08.197531   38047 out.go:298] Setting JSON to false
	I0717 17:45:08.197557   38047 mustload.go:65] Loading cluster: ha-333994
	I0717 17:45:08.197677   38047 notify.go:220] Checking for updates...
	I0717 17:45:08.197890   38047 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:45:08.197904   38047 status.go:255] checking status of ha-333994 ...
	I0717 17:45:08.198287   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.198329   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.218472   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45129
	I0717 17:45:08.218941   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.219715   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.219744   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.220150   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.220340   38047 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:45:08.221898   38047 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:45:08.221915   38047 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:08.222218   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.222251   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.237237   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40693
	I0717 17:45:08.237634   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.238102   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.238138   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.238468   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.238652   38047 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:45:08.241371   38047 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:08.241765   38047 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:08.241791   38047 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:08.241958   38047 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:08.242309   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.242347   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.256675   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37437
	I0717 17:45:08.257485   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.257940   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.257962   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.258277   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.258463   38047 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:45:08.258639   38047 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:08.258660   38047 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:45:08.261311   38047 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:08.261707   38047 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:08.261736   38047 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:08.261890   38047 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:45:08.262073   38047 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:45:08.262228   38047 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:45:08.262362   38047 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:45:08.350960   38047 ssh_runner.go:195] Run: systemctl --version
	I0717 17:45:08.357784   38047 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:08.375406   38047 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:08.375438   38047 api_server.go:166] Checking apiserver status ...
	I0717 17:45:08.375475   38047 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:45:08.395896   38047 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:45:08.406227   38047 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:08.406280   38047 ssh_runner.go:195] Run: ls
	I0717 17:45:08.410392   38047 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:45:08.414546   38047 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:45:08.414567   38047 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:45:08.414585   38047 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:08.414604   38047 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:45:08.414993   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.415036   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.430216   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33223
	I0717 17:45:08.430648   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.431067   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.431086   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.431411   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.431593   38047 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:45:08.433124   38047 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:45:08.433138   38047 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:08.433397   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.433425   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.448263   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37907
	I0717 17:45:08.448618   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.449015   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.449038   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.449296   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.449446   38047 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:45:08.451991   38047 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:08.452342   38047 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:08.452373   38047 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:08.452437   38047 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:08.452715   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.452747   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.467534   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41863
	I0717 17:45:08.467916   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.468373   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.468396   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.468676   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.468849   38047 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:45:08.469049   38047 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:08.469076   38047 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:45:08.471671   38047 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:08.472084   38047 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:08.472120   38047 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:08.472235   38047 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:45:08.472387   38047 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:45:08.472523   38047 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:45:08.472632   38047 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:45:08.550506   38047 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:08.570619   38047 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:08.570660   38047 api_server.go:166] Checking apiserver status ...
	I0717 17:45:08.570709   38047 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:45:08.590133   38047 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:08.590158   38047 status.go:422] ha-333994-m02 apiserver status = Running (err=<nil>)
	I0717 17:45:08.590167   38047 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Running APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:08.590181   38047 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:45:08.590522   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.590561   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.605831   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39533
	I0717 17:45:08.606244   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.606686   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.606707   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.607042   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.607238   38047 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:45:08.608743   38047 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:45:08.608760   38047 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:08.609672   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.609734   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.624290   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37117
	I0717 17:45:08.624714   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.625140   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.625166   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.625412   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.625562   38047 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:45:08.628394   38047 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:08.628817   38047 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:08.628855   38047 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:08.628988   38047 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:08.629361   38047 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:08.629407   38047 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:08.644525   38047 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37943
	I0717 17:45:08.644926   38047 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:08.645354   38047 main.go:141] libmachine: Using API Version  1
	I0717 17:45:08.645371   38047 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:08.645629   38047 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:08.645822   38047 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:45:08.645985   38047 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:08.646007   38047 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:45:08.648626   38047 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:08.649045   38047 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:08.649076   38047 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:08.649184   38047 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:45:08.649344   38047 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:45:08.649484   38047 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:45:08.649607   38047 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:45:08.731304   38047 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:08.748466   38047 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
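The status pass in the log above classifies each component purely by command exit status: over SSH it runs `sudo systemctl is-active --quiet service kubelet` and `sudo pgrep -xnf kube-apiserver.*minikube.*`, and a non-zero exit (as in the `api_server.go:170` line) maps to `Stopped`. A minimal Go sketch of that classification rule, assuming illustrative names (`componentState` is not minikube's actual API) and using locally available commands instead of SSH:

```go
package main

import (
	"fmt"
	"os/exec"
)

// componentState mirrors the classification visible in the log:
// command exits 0 -> "Running", any non-zero exit -> "Stopped".
// On the node, minikube runs these checks over SSH, e.g.
//   sudo systemctl is-active --quiet service kubelet
//   sudo pgrep -xnf kube-apiserver.*minikube.*
func componentState(name string, args ...string) string {
	if err := exec.Command(name, args...).Run(); err != nil {
		return "Stopped"
	}
	return "Running"
}

func main() {
	// Demonstrate the exit-code mapping with commands that are
	// guaranteed to succeed/fail on any POSIX system.
	fmt.Println(componentState("true"))
	fmt.Println(componentState("false"))
}
```

This is why the report shows `apiserver: Stopped` for m02: `pgrep` exited 1 (no matching process), and the status code treats that exit alone as a stopped apiserver.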
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (566.638484ms)
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	
-- /stdout --
** stderr ** 
	I0717 17:45:09.328573   38114 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:45:09.328773   38114 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:09.328781   38114 out.go:304] Setting ErrFile to fd 2...
	I0717 17:45:09.328785   38114 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:09.328943   38114 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:45:09.329106   38114 out.go:298] Setting JSON to false
	I0717 17:45:09.329132   38114 mustload.go:65] Loading cluster: ha-333994
	I0717 17:45:09.329180   38114 notify.go:220] Checking for updates...
	I0717 17:45:09.329462   38114 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:45:09.329473   38114 status.go:255] checking status of ha-333994 ...
	I0717 17:45:09.329800   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.329848   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.350845   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32957
	I0717 17:45:09.351332   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.351870   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.351886   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.352237   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.352464   38114 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:45:09.354097   38114 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:45:09.354131   38114 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:09.354430   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.354468   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.370687   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43711
	I0717 17:45:09.371028   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.371500   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.371518   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.371809   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.372001   38114 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:45:09.374938   38114 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:09.375294   38114 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:09.375320   38114 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:09.375472   38114 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:09.375850   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.375892   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.390545   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41915
	I0717 17:45:09.390967   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.391464   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.391485   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.391930   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.392144   38114 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:45:09.392346   38114 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:09.392362   38114 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:45:09.395359   38114 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:09.395819   38114 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:09.395843   38114 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:09.396034   38114 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:45:09.396206   38114 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:45:09.396374   38114 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:45:09.396519   38114 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:45:09.482240   38114 ssh_runner.go:195] Run: systemctl --version
	I0717 17:45:09.489439   38114 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:09.504623   38114 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:09.504648   38114 api_server.go:166] Checking apiserver status ...
	I0717 17:45:09.504679   38114 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:45:09.519083   38114 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:45:09.529602   38114 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:09.529663   38114 ssh_runner.go:195] Run: ls
	I0717 17:45:09.534308   38114 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:45:09.541701   38114 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:45:09.541732   38114 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:45:09.541744   38114 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:09.541765   38114 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:45:09.542197   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.542242   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.557321   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41397
	I0717 17:45:09.557761   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.558288   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.558304   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.558665   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.558880   38114 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:45:09.560447   38114 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:45:09.560463   38114 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:09.560769   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.560806   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.575782   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36329
	I0717 17:45:09.576147   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.576589   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.576603   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.576936   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.577105   38114 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:45:09.579870   38114 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:09.580277   38114 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:09.580299   38114 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:09.580465   38114 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:09.580751   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.580786   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.596531   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44957
	I0717 17:45:09.596977   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.597469   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.597492   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.597787   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.597955   38114 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:45:09.598181   38114 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:09.598200   38114 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:45:09.601147   38114 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:09.601566   38114 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:09.601593   38114 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:09.601733   38114 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:45:09.601883   38114 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:45:09.602015   38114 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:45:09.602138   38114 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:45:09.677289   38114 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:09.692235   38114 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:09.692259   38114 api_server.go:166] Checking apiserver status ...
	I0717 17:45:09.692290   38114 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:45:09.705097   38114 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:09.705117   38114 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:45:09.705128   38114 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:09.705147   38114 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:45:09.705558   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.705600   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.721050   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43055
	I0717 17:45:09.721433   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.721896   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.721912   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.722190   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.722358   38114 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:45:09.723684   38114 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:45:09.723700   38114 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:09.723993   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.724021   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.738735   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41349
	I0717 17:45:09.739134   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.739584   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.739635   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.739901   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.740061   38114 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:45:09.742452   38114 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:09.742735   38114 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:09.742770   38114 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:09.742959   38114 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:09.743357   38114 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:09.743395   38114 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:09.758618   38114 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42473
	I0717 17:45:09.758978   38114 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:09.759398   38114 main.go:141] libmachine: Using API Version  1
	I0717 17:45:09.759420   38114 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:09.759684   38114 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:09.759857   38114 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:45:09.759996   38114 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:09.760011   38114 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:45:09.762473   38114 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:09.762893   38114 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:09.762925   38114 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:09.763065   38114 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:45:09.763233   38114 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:45:09.763369   38114 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:45:09.763505   38114 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:45:09.841717   38114 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:09.855420   38114 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (567.919874ms)
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	
-- /stdout --
** stderr ** 
	I0717 17:45:12.136243   38197 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:45:12.136349   38197 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:12.136358   38197 out.go:304] Setting ErrFile to fd 2...
	I0717 17:45:12.136364   38197 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:12.136543   38197 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:45:12.136712   38197 out.go:298] Setting JSON to false
	I0717 17:45:12.136745   38197 mustload.go:65] Loading cluster: ha-333994
	I0717 17:45:12.136846   38197 notify.go:220] Checking for updates...
	I0717 17:45:12.137135   38197 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:45:12.137150   38197 status.go:255] checking status of ha-333994 ...
	I0717 17:45:12.137524   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.137639   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.157993   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33643
	I0717 17:45:12.158409   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.158925   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.158944   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.159222   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.159423   38197 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:45:12.160929   38197 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:45:12.160943   38197 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:12.161256   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.161293   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.175549   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34557
	I0717 17:45:12.175890   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.176307   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.176325   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.176657   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.176865   38197 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:45:12.179452   38197 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:12.179848   38197 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:12.179883   38197 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:12.180051   38197 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:12.180358   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.180401   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.195064   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41257
	I0717 17:45:12.195511   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.196008   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.196036   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.196321   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.196496   38197 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:45:12.196738   38197 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:12.196771   38197 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:45:12.199156   38197 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:12.199557   38197 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:12.199584   38197 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:12.199715   38197 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:45:12.199898   38197 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:45:12.200034   38197 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:45:12.200175   38197 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:45:12.286030   38197 ssh_runner.go:195] Run: systemctl --version
	I0717 17:45:12.292099   38197 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:12.307111   38197 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:12.307149   38197 api_server.go:166] Checking apiserver status ...
	I0717 17:45:12.307192   38197 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:45:12.321252   38197 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:45:12.331057   38197 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:12.331135   38197 ssh_runner.go:195] Run: ls
	I0717 17:45:12.337047   38197 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:45:12.343044   38197 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:45:12.343068   38197 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:45:12.343077   38197 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:12.343094   38197 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:45:12.343422   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.343463   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.359253   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34451
	I0717 17:45:12.359674   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.360257   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.360277   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.360622   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.360800   38197 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:45:12.362378   38197 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:45:12.362393   38197 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:12.362680   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.362720   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.377087   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42435
	I0717 17:45:12.377594   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.378153   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.378178   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.378492   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.378674   38197 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:45:12.381061   38197 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:12.381518   38197 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:12.381553   38197 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:12.381719   38197 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:12.382007   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.382041   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.397440   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34117
	I0717 17:45:12.397891   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.398334   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.398359   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.398658   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.398851   38197 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:45:12.399015   38197 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:12.399031   38197 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:45:12.402409   38197 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:12.402801   38197 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:12.402837   38197 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:12.402970   38197 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:45:12.403108   38197 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:45:12.403231   38197 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:45:12.403488   38197 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:45:12.481342   38197 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:12.497564   38197 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:12.497592   38197 api_server.go:166] Checking apiserver status ...
	I0717 17:45:12.497631   38197 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:45:12.511940   38197 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:12.511966   38197 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:45:12.511974   38197 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:12.511998   38197 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:45:12.512294   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.512337   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.527840   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33085
	I0717 17:45:12.528227   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.528693   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.528711   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.528980   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.529168   38197 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:45:12.530730   38197 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:45:12.530748   38197 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:12.531021   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.531053   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.546643   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46467
	I0717 17:45:12.547031   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.547453   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.547478   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.547770   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.547982   38197 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:45:12.550443   38197 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:12.550815   38197 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:12.550842   38197 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:12.550999   38197 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:12.551319   38197 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:12.551358   38197 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:12.566624   38197 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41399
	I0717 17:45:12.567006   38197 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:12.567450   38197 main.go:141] libmachine: Using API Version  1
	I0717 17:45:12.567472   38197 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:12.567771   38197 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:12.567971   38197 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:45:12.568149   38197 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:12.568167   38197 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:45:12.570838   38197 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:12.571288   38197 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:12.571315   38197 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:12.571452   38197 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:45:12.571616   38197 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:45:12.571771   38197 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:45:12.571907   38197 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:45:12.649945   38197 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:12.663563   38197 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (580.522386ms)

-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

-- /stdout --
** stderr ** 
	I0717 17:45:14.245163   38263 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:45:14.245406   38263 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:14.245414   38263 out.go:304] Setting ErrFile to fd 2...
	I0717 17:45:14.245418   38263 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:14.245601   38263 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:45:14.245768   38263 out.go:298] Setting JSON to false
	I0717 17:45:14.245793   38263 mustload.go:65] Loading cluster: ha-333994
	I0717 17:45:14.245914   38263 notify.go:220] Checking for updates...
	I0717 17:45:14.246157   38263 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:45:14.246172   38263 status.go:255] checking status of ha-333994 ...
	I0717 17:45:14.246562   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.246614   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.264616   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40995
	I0717 17:45:14.265056   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.265638   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.265665   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.266194   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.266405   38263 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:45:14.268007   38263 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:45:14.268022   38263 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:14.268309   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.268338   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.283140   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37989
	I0717 17:45:14.283492   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.283940   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.283965   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.284241   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.284408   38263 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:45:14.286888   38263 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:14.287265   38263 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:14.287294   38263 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:14.287402   38263 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:14.287773   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.287813   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.302039   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46055
	I0717 17:45:14.302440   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.302862   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.302883   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.303178   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.303368   38263 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:45:14.303585   38263 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:14.303613   38263 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:45:14.305902   38263 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:14.306233   38263 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:14.306262   38263 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:14.306434   38263 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:45:14.306624   38263 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:45:14.306776   38263 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:45:14.306893   38263 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:45:14.398611   38263 ssh_runner.go:195] Run: systemctl --version
	I0717 17:45:14.405386   38263 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:14.424137   38263 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:14.424166   38263 api_server.go:166] Checking apiserver status ...
	I0717 17:45:14.424205   38263 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:45:14.442074   38263 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:45:14.454310   38263 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:14.454399   38263 ssh_runner.go:195] Run: ls
	I0717 17:45:14.459035   38263 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:45:14.463510   38263 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:45:14.463532   38263 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:45:14.463541   38263 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:14.463557   38263 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:45:14.463980   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.464029   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.479400   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42207
	I0717 17:45:14.479881   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.480341   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.480359   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.480687   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.480870   38263 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:45:14.482289   38263 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:45:14.482307   38263 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:14.482590   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.482620   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.497038   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35681
	I0717 17:45:14.497885   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.498512   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.498559   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.498867   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.499053   38263 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:45:14.501751   38263 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:14.502142   38263 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:14.502166   38263 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:14.502286   38263 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:14.502605   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.502639   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.517609   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42481
	I0717 17:45:14.518038   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.518502   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.518532   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.518846   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.519010   38263 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:45:14.519168   38263 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:14.519187   38263 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:45:14.521856   38263 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:14.522321   38263 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:14.522340   38263 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:14.522488   38263 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:45:14.522638   38263 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:45:14.522798   38263 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:45:14.522907   38263 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:45:14.602536   38263 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:14.618217   38263 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:14.618246   38263 api_server.go:166] Checking apiserver status ...
	I0717 17:45:14.618283   38263 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:45:14.633221   38263 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:14.633243   38263 status.go:422] ha-333994-m02 apiserver status = Running (err=<nil>)
	I0717 17:45:14.633255   38263 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Running APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:14.633285   38263 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:45:14.633667   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.633703   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.649110   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44691
	I0717 17:45:14.649502   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.649960   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.649990   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.650305   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.650494   38263 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:45:14.652093   38263 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:45:14.652110   38263 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:14.652427   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.652463   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.667242   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36297
	I0717 17:45:14.667674   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.668122   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.668142   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.668469   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.668640   38263 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:45:14.671079   38263 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:14.671430   38263 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:14.671458   38263 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:14.671573   38263 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:14.671865   38263 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:14.671896   38263 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:14.686985   38263 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43869
	I0717 17:45:14.687369   38263 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:14.687793   38263 main.go:141] libmachine: Using API Version  1
	I0717 17:45:14.687811   38263 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:14.688108   38263 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:14.688278   38263 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:45:14.688436   38263 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:14.688454   38263 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:45:14.691073   38263 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:14.691455   38263 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:14.691480   38263 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:14.691597   38263 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:45:14.691770   38263 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:45:14.691902   38263 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:45:14.692010   38263 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:45:14.769535   38263 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:14.783496   38263 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (584.396346ms)

-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

-- /stdout --
** stderr ** 
	I0717 17:45:18.025943   38345 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:45:18.026055   38345 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:18.026064   38345 out.go:304] Setting ErrFile to fd 2...
	I0717 17:45:18.026068   38345 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:18.026276   38345 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:45:18.026439   38345 out.go:298] Setting JSON to false
	I0717 17:45:18.026467   38345 mustload.go:65] Loading cluster: ha-333994
	I0717 17:45:18.026514   38345 notify.go:220] Checking for updates...
	I0717 17:45:18.026800   38345 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:45:18.026814   38345 status.go:255] checking status of ha-333994 ...
	I0717 17:45:18.027136   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.027190   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.045387   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38895
	I0717 17:45:18.045884   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.046509   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.046540   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.046882   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.047064   38345 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:45:18.048567   38345 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:45:18.048584   38345 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:18.048878   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.048909   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.063927   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37807
	I0717 17:45:18.064432   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.064930   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.064953   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.065279   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.065431   38345 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:45:18.068145   38345 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:18.068564   38345 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:18.068608   38345 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:18.068762   38345 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:18.069045   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.069086   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.084019   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45511
	I0717 17:45:18.084342   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.084778   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.084801   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.085097   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.085283   38345 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:45:18.085472   38345 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:18.085508   38345 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:45:18.088296   38345 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:18.088666   38345 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:18.088698   38345 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:18.088781   38345 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:45:18.088964   38345 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:45:18.089135   38345 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:45:18.089266   38345 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:45:18.178179   38345 ssh_runner.go:195] Run: systemctl --version
	I0717 17:45:18.185632   38345 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:18.202207   38345 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:18.202234   38345 api_server.go:166] Checking apiserver status ...
	I0717 17:45:18.202282   38345 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:45:18.221286   38345 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:45:18.235045   38345 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:18.235107   38345 ssh_runner.go:195] Run: ls
	I0717 17:45:18.240562   38345 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:45:18.245302   38345 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:45:18.245331   38345 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:45:18.245342   38345 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:18.245363   38345 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:45:18.245654   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.245700   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.261964   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38459
	I0717 17:45:18.262362   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.262896   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.262930   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.263280   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.263454   38345 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:45:18.264942   38345 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:45:18.264961   38345 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:18.265266   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.265327   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.280883   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33561
	I0717 17:45:18.281330   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.281887   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.281914   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.282291   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.282484   38345 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:45:18.285141   38345 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:18.285563   38345 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:18.285583   38345 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:18.285743   38345 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:18.286062   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.286099   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.301046   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38279
	I0717 17:45:18.301467   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.301904   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.301929   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.302252   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.302460   38345 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:45:18.302646   38345 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:18.302665   38345 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:45:18.305556   38345 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:18.305974   38345 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:18.306003   38345 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:18.306148   38345 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:45:18.306323   38345 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:45:18.306483   38345 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:45:18.306630   38345 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:45:18.393360   38345 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:18.408343   38345 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:18.408369   38345 api_server.go:166] Checking apiserver status ...
	I0717 17:45:18.408399   38345 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:45:18.420448   38345 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:18.420468   38345 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:45:18.420476   38345 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:18.420506   38345 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:45:18.420828   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.420863   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.436577   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42375
	I0717 17:45:18.436982   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.437432   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.437450   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.437755   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.437920   38345 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:45:18.439544   38345 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:45:18.439562   38345 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:18.439994   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.440037   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.455422   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34991
	I0717 17:45:18.455818   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.456339   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.456367   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.456672   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.456854   38345 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:45:18.459574   38345 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:18.459970   38345 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:18.459995   38345 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:18.460166   38345 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:18.460451   38345 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:18.460490   38345 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:18.475318   38345 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37459
	I0717 17:45:18.475737   38345 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:18.476144   38345 main.go:141] libmachine: Using API Version  1
	I0717 17:45:18.476163   38345 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:18.476417   38345 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:18.476558   38345 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:45:18.476732   38345 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:18.476748   38345 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:45:18.479160   38345 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:18.479546   38345 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:18.479580   38345 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:18.479736   38345 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:45:18.479930   38345 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:45:18.480116   38345 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:45:18.480264   38345 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:45:18.557334   38345 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:18.571176   38345 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
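The trace above shows the three probes `minikube status` runs per node over SSH: `systemctl is-active --quiet service kubelet`, `pgrep -xnf kube-apiserver.*minikube.*`, and an HTTPS GET against `/healthz` on the control-plane VIP. The per-node verdict printed later (`Kubelet:Stopped APIServer:Stopped` for `ha-333994-m02`) follows mechanically from those probe results. A minimal sketch of that decision logic, with hypothetical type and function names (not minikube's actual code):

```go
package main

import "fmt"

// probes holds the per-node results gathered over SSH, mirroring the log:
//   kubeletActive — sudo systemctl is-active --quiet service kubelet
//   apiserverPID  — sudo pgrep -xnf kube-apiserver.*minikube.* found a pid
//   healthzOK     — GET https://<vip>:8443/healthz returned 200
type probes struct {
	kubeletActive bool
	apiserverPID  bool
	healthzOK     bool
}

// nodeStatus derives the Kubelet/APIServer fields shown in the status output.
// The apiserver is only reported Running when a pid was found and healthz
// answered 200; a failed pgrep short-circuits to Stopped, as seen for m02.
func nodeStatus(p probes) (kubelet, apiserver string) {
	kubelet = "Stopped"
	if p.kubeletActive {
		kubelet = "Running"
	}
	apiserver = "Stopped"
	if p.apiserverPID && p.healthzOK {
		apiserver = "Running"
	}
	return
}

func main() {
	// ha-333994: all probes succeeded.
	k, a := nodeStatus(probes{kubeletActive: true, apiserverPID: true, healthzOK: true})
	fmt.Println(k, a) // → Running Running

	// ha-333994-m02: kubelet inactive, no apiserver pid.
	k, a = nodeStatus(probes{})
	fmt.Println(k, a) // → Stopped Stopped
}
```

Note in the log that for `ha-333994` the freezer-cgroup lookup fails with a warning, but the status still comes out Running because the healthz probe is the deciding check; the cgroup step is only used to locate the process, not to judge health.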
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (562.431298ms)

                                                
                                                
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:45:22.908354   38427 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:45:22.908466   38427 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:22.908475   38427 out.go:304] Setting ErrFile to fd 2...
	I0717 17:45:22.908479   38427 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:22.908757   38427 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:45:22.908969   38427 out.go:298] Setting JSON to false
	I0717 17:45:22.909001   38427 mustload.go:65] Loading cluster: ha-333994
	I0717 17:45:22.909057   38427 notify.go:220] Checking for updates...
	I0717 17:45:22.909494   38427 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:45:22.909513   38427 status.go:255] checking status of ha-333994 ...
	I0717 17:45:22.910021   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:22.910068   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:22.926740   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44247
	I0717 17:45:22.927128   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:22.927692   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:22.927715   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:22.928107   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:22.928288   38427 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:45:22.929914   38427 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:45:22.929930   38427 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:22.930268   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:22.930319   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:22.947820   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39353
	I0717 17:45:22.948359   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:22.948821   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:22.948846   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:22.949176   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:22.949379   38427 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:45:22.952362   38427 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:22.952792   38427 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:22.952824   38427 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:22.952979   38427 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:22.953281   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:22.953315   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:22.967979   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35911
	I0717 17:45:22.968397   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:22.968845   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:22.968871   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:22.969169   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:22.969349   38427 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:45:22.969535   38427 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:22.969557   38427 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:45:22.972364   38427 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:22.972724   38427 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:22.972761   38427 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:22.972874   38427 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:45:22.973048   38427 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:45:22.973177   38427 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:45:22.973295   38427 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:45:23.057495   38427 ssh_runner.go:195] Run: systemctl --version
	I0717 17:45:23.063541   38427 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:23.079685   38427 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:23.079729   38427 api_server.go:166] Checking apiserver status ...
	I0717 17:45:23.079765   38427 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:45:23.095295   38427 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:45:23.105429   38427 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:23.105505   38427 ssh_runner.go:195] Run: ls
	I0717 17:45:23.109989   38427 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:45:23.114319   38427 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:45:23.114341   38427 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:45:23.114351   38427 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:23.114371   38427 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:45:23.114676   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:23.114712   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:23.129375   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39507
	I0717 17:45:23.129769   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:23.130239   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:23.130265   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:23.130581   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:23.130753   38427 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:45:23.132237   38427 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:45:23.132257   38427 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:23.132633   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:23.132675   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:23.147322   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38583
	I0717 17:45:23.147670   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:23.148119   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:23.148147   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:23.148455   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:23.148636   38427 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:45:23.151127   38427 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:23.151541   38427 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:23.151566   38427 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:23.151612   38427 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:23.151906   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:23.151944   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:23.166199   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37795
	I0717 17:45:23.166608   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:23.167059   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:23.167082   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:23.167376   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:23.167556   38427 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:45:23.167752   38427 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:23.167775   38427 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:45:23.170162   38427 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:23.170573   38427 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:23.170600   38427 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:23.170686   38427 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:45:23.170842   38427 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:45:23.170989   38427 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:45:23.171127   38427 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:45:23.249968   38427 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:23.264864   38427 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:23.264889   38427 api_server.go:166] Checking apiserver status ...
	I0717 17:45:23.264921   38427 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:45:23.277026   38427 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:23.277044   38427 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:45:23.277052   38427 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:23.277067   38427 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:45:23.277389   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:23.277427   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:23.292602   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45031
	I0717 17:45:23.292978   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:23.293490   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:23.293507   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:23.293818   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:23.294047   38427 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:45:23.295618   38427 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:45:23.295633   38427 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:23.296000   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:23.296043   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:23.310766   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37211
	I0717 17:45:23.311123   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:23.311546   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:23.311564   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:23.311856   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:23.312023   38427 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:45:23.314768   38427 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:23.315109   38427 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:23.315151   38427 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:23.315265   38427 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:23.315645   38427 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:23.315699   38427 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:23.330583   38427 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40005
	I0717 17:45:23.331051   38427 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:23.331491   38427 main.go:141] libmachine: Using API Version  1
	I0717 17:45:23.331511   38427 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:23.331771   38427 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:23.331960   38427 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:45:23.332129   38427 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:23.332151   38427 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:45:23.334633   38427 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:23.335004   38427 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:23.335033   38427 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:23.335167   38427 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:45:23.335325   38427 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:45:23.335450   38427 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:45:23.335565   38427 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:45:23.413792   38427 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:23.428683   38427 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (556.718589ms)

                                                
                                                
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:45:34.364266   38525 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:45:34.364599   38525 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:34.364613   38525 out.go:304] Setting ErrFile to fd 2...
	I0717 17:45:34.364619   38525 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:34.364902   38525 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:45:34.365109   38525 out.go:298] Setting JSON to false
	I0717 17:45:34.365134   38525 mustload.go:65] Loading cluster: ha-333994
	I0717 17:45:34.365240   38525 notify.go:220] Checking for updates...
	I0717 17:45:34.365477   38525 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:45:34.365492   38525 status.go:255] checking status of ha-333994 ...
	I0717 17:45:34.365852   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.365888   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.383875   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34661
	I0717 17:45:34.384318   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.384953   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.384979   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.385337   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.385553   38525 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:45:34.387194   38525 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:45:34.387211   38525 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:34.387514   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.387565   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.401737   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34341
	I0717 17:45:34.402075   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.402481   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.402503   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.402764   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.402942   38525 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:45:34.405376   38525 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:34.405781   38525 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:34.405818   38525 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:34.405923   38525 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:34.406251   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.406299   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.420560   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44119
	I0717 17:45:34.420993   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.421424   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.421447   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.421742   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.421900   38525 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:45:34.422102   38525 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:34.422140   38525 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:45:34.424727   38525 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:34.425112   38525 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:34.425135   38525 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:34.425282   38525 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:45:34.425459   38525 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:45:34.425592   38525 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:45:34.425720   38525 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:45:34.509606   38525 ssh_runner.go:195] Run: systemctl --version
	I0717 17:45:34.515382   38525 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:34.539754   38525 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:34.539786   38525 api_server.go:166] Checking apiserver status ...
	I0717 17:45:34.539819   38525 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:45:34.554774   38525 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:45:34.566224   38525 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:34.566278   38525 ssh_runner.go:195] Run: ls
	I0717 17:45:34.570510   38525 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:45:34.574535   38525 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:45:34.574556   38525 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:45:34.574567   38525 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:34.574587   38525 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:45:34.574974   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.575013   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.589196   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34195
	I0717 17:45:34.589570   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.590036   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.590054   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.590355   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.590562   38525 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:45:34.591948   38525 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:45:34.591964   38525 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:34.592229   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.592258   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.606163   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35111
	I0717 17:45:34.606560   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.606962   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.606986   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.607284   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.607465   38525 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:45:34.609855   38525 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:34.610252   38525 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:34.610287   38525 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:34.610441   38525 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:34.610720   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.610752   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.624838   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35523
	I0717 17:45:34.625249   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.625751   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.625770   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.626039   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.626226   38525 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:45:34.626405   38525 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:34.626424   38525 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:45:34.629277   38525 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:34.629593   38525 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:34.629617   38525 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:34.629709   38525 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:45:34.629861   38525 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:45:34.630018   38525 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:45:34.630150   38525 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:45:34.705245   38525 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:34.719239   38525 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:34.719261   38525 api_server.go:166] Checking apiserver status ...
	I0717 17:45:34.719298   38525 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:45:34.731529   38525 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:34.731549   38525 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:45:34.731557   38525 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:34.731573   38525 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:45:34.731891   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.731933   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.746362   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43629
	I0717 17:45:34.746754   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.747183   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.747202   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.747479   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.747650   38525 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:45:34.749131   38525 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:45:34.749149   38525 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:34.749419   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.749449   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.763415   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37001
	I0717 17:45:34.763798   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.764301   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.764323   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.764618   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.764791   38525 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:45:34.767251   38525 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:34.767623   38525 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:34.767647   38525 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:34.767791   38525 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:34.768085   38525 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:34.768117   38525 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:34.782233   38525 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40647
	I0717 17:45:34.782622   38525 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:34.783079   38525 main.go:141] libmachine: Using API Version  1
	I0717 17:45:34.783100   38525 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:34.783358   38525 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:34.783548   38525 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:45:34.783713   38525 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:34.783734   38525 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:45:34.786373   38525 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:34.786766   38525 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:34.786795   38525 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:34.786970   38525 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:45:34.787154   38525 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:45:34.787298   38525 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:45:34.787437   38525 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:45:34.865572   38525 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:34.879541   38525 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (553.900287ms)

                                                
                                                
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:45:44.707232   38624 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:45:44.707456   38624 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:44.707464   38624 out.go:304] Setting ErrFile to fd 2...
	I0717 17:45:44.707468   38624 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:45:44.707640   38624 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:45:44.707847   38624 out.go:298] Setting JSON to false
	I0717 17:45:44.707876   38624 mustload.go:65] Loading cluster: ha-333994
	I0717 17:45:44.707922   38624 notify.go:220] Checking for updates...
	I0717 17:45:44.708267   38624 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:45:44.708284   38624 status.go:255] checking status of ha-333994 ...
	I0717 17:45:44.708693   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:44.708756   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:44.729317   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33959
	I0717 17:45:44.729785   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:44.730388   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:44.730410   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:44.730804   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:44.731001   38624 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:45:44.732637   38624 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:45:44.732650   38624 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:44.732926   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:44.732961   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:44.747429   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40077
	I0717 17:45:44.747811   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:44.748205   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:44.748225   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:44.748580   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:44.748764   38624 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:45:44.751355   38624 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:44.751714   38624 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:44.751745   38624 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:44.751890   38624 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:45:44.752165   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:44.752194   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:44.766747   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44195
	I0717 17:45:44.767171   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:44.767599   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:44.767619   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:44.767851   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:44.768040   38624 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:45:44.768212   38624 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:44.768237   38624 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:45:44.770899   38624 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:44.771318   38624 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:45:44.771343   38624 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:45:44.771467   38624 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:45:44.771641   38624 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:45:44.771796   38624 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:45:44.771926   38624 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:45:44.857779   38624 ssh_runner.go:195] Run: systemctl --version
	I0717 17:45:44.863806   38624 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:44.879440   38624 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:44.879472   38624 api_server.go:166] Checking apiserver status ...
	I0717 17:45:44.879519   38624 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:45:44.893837   38624 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:45:44.903724   38624 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:44.903783   38624 ssh_runner.go:195] Run: ls
	I0717 17:45:44.908220   38624 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:45:44.913470   38624 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:45:44.913495   38624 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:45:44.913506   38624 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:44.913529   38624 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:45:44.913828   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:44.913861   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:44.928190   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38603
	I0717 17:45:44.928539   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:44.929101   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:44.929119   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:44.929457   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:44.929636   38624 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:45:44.931013   38624 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:45:44.931028   38624 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:44.931320   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:44.931363   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:44.945183   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38419
	I0717 17:45:44.945621   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:44.946044   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:44.946064   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:44.946350   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:44.946513   38624 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:45:44.949172   38624 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:44.949541   38624 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:44.949573   38624 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:44.949692   38624 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:45:44.950028   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:44.950072   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:44.964204   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32781
	I0717 17:45:44.964569   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:44.964989   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:44.965010   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:44.965297   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:44.965480   38624 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:45:44.965667   38624 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:44.965686   38624 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:45:44.968092   38624 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:44.968470   38624 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:45:44.968493   38624 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:45:44.968619   38624 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:45:44.968765   38624 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:45:44.968903   38624 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:45:44.969049   38624 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:45:45.045007   38624 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:45.060560   38624 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:45:45.060583   38624 api_server.go:166] Checking apiserver status ...
	I0717 17:45:45.060612   38624 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:45:45.072939   38624 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:45:45.072957   38624 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:45:45.072968   38624 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:45:45.072996   38624 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:45:45.073282   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:45.073323   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:45.087806   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36923
	I0717 17:45:45.088184   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:45.088592   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:45.088614   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:45.088885   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:45.089065   38624 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:45:45.090487   38624 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:45:45.090501   38624 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:45.090781   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:45.090835   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:45.104776   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35369
	I0717 17:45:45.105187   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:45.105666   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:45.105683   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:45.105988   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:45.106173   38624 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:45:45.108865   38624 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:45.109232   38624 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:45.109251   38624 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:45.109359   38624 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:45:45.109631   38624 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:45:45.109668   38624 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:45:45.123671   38624 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35069
	I0717 17:45:45.124048   38624 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:45:45.124446   38624 main.go:141] libmachine: Using API Version  1
	I0717 17:45:45.124472   38624 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:45:45.124747   38624 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:45:45.124899   38624 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:45:45.125092   38624 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:45:45.125114   38624 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:45:45.127771   38624 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:45.128126   38624 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:45:45.128161   38624 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:45:45.128244   38624 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:45:45.128407   38624 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:45:45.128540   38624 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:45:45.128675   38624 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:45:45.205508   38624 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:45:45.220640   38624 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (569.193112ms)

                                                
                                                
-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-333994-m03
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:46:01.520354   38755 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:46:01.520450   38755 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:46:01.520458   38755 out.go:304] Setting ErrFile to fd 2...
	I0717 17:46:01.520462   38755 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:46:01.520631   38755 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:46:01.520783   38755 out.go:298] Setting JSON to false
	I0717 17:46:01.520808   38755 mustload.go:65] Loading cluster: ha-333994
	I0717 17:46:01.520841   38755 notify.go:220] Checking for updates...
	I0717 17:46:01.521151   38755 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:46:01.521164   38755 status.go:255] checking status of ha-333994 ...
	I0717 17:46:01.521495   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.521545   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.540332   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39067
	I0717 17:46:01.540735   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.541298   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.541321   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.541721   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.541911   38755 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:46:01.543616   38755 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:46:01.543632   38755 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:46:01.543955   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.544000   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.558391   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35805
	I0717 17:46:01.558769   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.559210   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.559242   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.559557   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.559696   38755 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:46:01.562297   38755 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:46:01.562711   38755 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:46:01.562743   38755 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:46:01.562849   38755 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:46:01.563180   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.563219   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.577097   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43453
	I0717 17:46:01.577446   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.577897   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.577914   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.578231   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.578416   38755 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:46:01.578595   38755 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:46:01.578620   38755 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:46:01.581220   38755 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:46:01.581579   38755 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:46:01.581596   38755 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:46:01.581732   38755 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:46:01.581884   38755 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:46:01.582038   38755 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:46:01.582184   38755 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:46:01.665567   38755 ssh_runner.go:195] Run: systemctl --version
	I0717 17:46:01.672326   38755 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:46:01.687268   38755 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:46:01.687294   38755 api_server.go:166] Checking apiserver status ...
	I0717 17:46:01.687331   38755 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:46:01.704544   38755 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup
	W0717 17:46:01.718511   38755 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1202/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:46:01.718569   38755 ssh_runner.go:195] Run: ls
	I0717 17:46:01.725195   38755 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:46:01.730232   38755 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:46:01.730257   38755 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:46:01.730269   38755 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:46:01.730291   38755 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:46:01.730571   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.730615   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.744991   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40619
	I0717 17:46:01.745353   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.745785   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.745810   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.746189   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.746383   38755 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:46:01.747844   38755 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:46:01.747857   38755 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:46:01.748126   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.748155   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.762443   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39483
	I0717 17:46:01.762847   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.763297   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.763318   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.763563   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.763724   38755 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:46:01.766244   38755 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:46:01.766630   38755 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:46:01.766655   38755 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:46:01.766810   38755 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:46:01.767091   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.767124   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.781164   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34491
	I0717 17:46:01.781576   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.781990   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.782009   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.782367   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.782552   38755 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:46:01.782730   38755 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:46:01.782752   38755 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:46:01.785241   38755 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:46:01.785629   38755 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:41:00 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:46:01.785654   38755 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:46:01.785780   38755 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:46:01.785944   38755 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:46:01.786093   38755 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:46:01.786250   38755 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:46:01.865282   38755 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:46:01.886072   38755 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:46:01.886095   38755 api_server.go:166] Checking apiserver status ...
	I0717 17:46:01.886153   38755 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:46:01.897745   38755 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:46:01.897769   38755 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:46:01.897780   38755 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:46:01.897804   38755 status.go:255] checking status of ha-333994-m03 ...
	I0717 17:46:01.898096   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.898158   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.913110   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38777
	I0717 17:46:01.913505   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.913939   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.913956   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.914280   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.914445   38755 main.go:141] libmachine: (ha-333994-m03) Calling .GetState
	I0717 17:46:01.915919   38755 status.go:330] ha-333994-m03 host status = "Running" (err=<nil>)
	I0717 17:46:01.915933   38755 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:46:01.916225   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.916255   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.930470   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40159
	I0717 17:46:01.930816   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.931243   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.931267   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.931577   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.931755   38755 main.go:141] libmachine: (ha-333994-m03) Calling .GetIP
	I0717 17:46:01.934512   38755 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:46:01.934912   38755 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:46:01.934937   38755 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:46:01.935069   38755 host.go:66] Checking if "ha-333994-m03" exists ...
	I0717 17:46:01.935340   38755 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:46:01.935371   38755 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:46:01.949458   38755 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44225
	I0717 17:46:01.949825   38755 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:46:01.950266   38755 main.go:141] libmachine: Using API Version  1
	I0717 17:46:01.950286   38755 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:46:01.950571   38755 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:46:01.950754   38755 main.go:141] libmachine: (ha-333994-m03) Calling .DriverName
	I0717 17:46:01.950908   38755 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:46:01.950925   38755 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHHostname
	I0717 17:46:01.953474   38755 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:46:01.953854   38755 main.go:141] libmachine: (ha-333994-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:4b:0e:98", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:38:56 +0000 UTC Type:0 Mac:52:54:00:4b:0e:98 Iaid: IPaddr:192.168.39.197 Prefix:24 Hostname:ha-333994-m03 Clientid:01:52:54:00:4b:0e:98}
	I0717 17:46:01.953897   38755 main.go:141] libmachine: (ha-333994-m03) DBG | domain ha-333994-m03 has defined IP address 192.168.39.197 and MAC address 52:54:00:4b:0e:98 in network mk-ha-333994
	I0717 17:46:01.954086   38755 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHPort
	I0717 17:46:01.954251   38755 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHKeyPath
	I0717 17:46:01.954379   38755 main.go:141] libmachine: (ha-333994-m03) Calling .GetSSHUsername
	I0717 17:46:01.954500   38755 sshutil.go:53] new ssh client: &{IP:192.168.39.197 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m03/id_rsa Username:docker}
	I0717 17:46:02.033556   38755 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:46:02.047424   38755 status.go:257] ha-333994-m03 status: &{Name:ha-333994-m03 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
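The stderr trace above shows how the `status` command decides each node's apiserver state: over SSH it runs `sudo pgrep -xnf kube-apiserver.*minikube.*`, and a non-zero exit (as on ha-333994-m02) is logged as `apiserver status = Stopped`, while a found PID is followed by a probe of `https://192.168.39.254:8443/healthz`, where a 200 response yields `Running`. A minimal sketch of that decision logic, with the two probe results passed in as arguments (the helper `classify_apiserver` is illustrative, not minikube source):

```shell
#!/bin/sh
# Sketch of the apiserver-status decision seen in the log (illustrative only).
# The real probes run over SSH on each control-plane node:
#   sudo pgrep -xnf 'kube-apiserver.*minikube.*'    # exit 0 iff a PID is found
#   GET https://192.168.39.254:8443/healthz         # returns 200 "ok" when healthy

# classify_apiserver PGREP_RC HEALTHZ_CODE -> Running | Stopped | Error
classify_apiserver() {
  pgrep_rc=$1
  healthz_code=$2
  if [ "$pgrep_rc" -ne 0 ]; then
    # No kube-apiserver process: status.go logs "apiserver status = Stopped"
    echo "Stopped"
  elif [ "$healthz_code" = "200" ]; then
    # Process found and /healthz returned 200: "apiserver status = Running"
    echo "Running"
  else
    # Process found but the health endpoint did not answer 200
    echo "Error"
  fi
}

classify_apiserver 0 200   # ha-333994:     prints "Running"
classify_apiserver 1 ""    # ha-333994-m02: prints "Stopped"
```

This matches the report: the primary node passes both probes, while m02's `pgrep` exits with status 1 before any healthz check is attempted, so the status summary shows `apiserver: Stopped` even though the host itself is `Running`.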
ha_test.go:432: failed to run minikube status. args "out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/RestartSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/RestartSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.208308307s)
helpers_test.go:252: TestMultiControlPlane/serial/RestartSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node stop m02 -v=7         | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node start m02 -v=7        | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
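The external SSH probe above runs `exit 0` over ssh with host-key checking disabled; a zero exit means sshd inside the VM is accepting connections. A sketch of assembling that invocation with `os/exec` follows — the key path and address are the ones from this log, so the command is only printed, not run.

```go
package main

import (
	"fmt"
	"os/exec"
)

// buildSSHCommand assembles an external ssh invocation mirroring the flags
// logged above. Running it for real requires the VM from this test to be
// reachable, so main only prints the command line.
func buildSSHCommand() *exec.Cmd {
	args := []string{
		"-F", "/dev/null",
		"-o", "ConnectionAttempts=3",
		"-o", "ConnectTimeout=10",
		"-o", "ControlMaster=no",
		"-o", "ControlPath=none",
		"-o", "LogLevel=quiet",
		"-o", "PasswordAuthentication=no",
		"-o", "ServerAliveInterval=60",
		"-o", "StrictHostKeyChecking=no",
		"-o", "UserKnownHostsFile=/dev/null",
		"-o", "IdentitiesOnly=yes",
		"-i", "/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa",
		"-p", "22",
		"docker@192.168.39.180",
		"exit 0", // probe command: a zero exit code means sshd is up
	}
	return exec.Command("/usr/bin/ssh", args...)
}

func main() {
	fmt.Println(buildSSHCommand().String())
}
```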
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
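The `provision.go` line above generates a server certificate whose SAN list is `[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]`. A sketch of building a certificate with that SAN set via `crypto/x509` follows; it is self-signed for brevity, whereas minikube signs the server cert with its CA key (`ca-key.pem`).

```go
package main

import (
	"crypto/rand"
	"crypto/rsa"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"net"
	"time"
)

// makeServerCert builds a certificate carrying the SAN list from the
// provision.go line above. Self-signed here for brevity; minikube signs
// with its CA key instead.
func makeServerCert() (*x509.Certificate, error) {
	key, err := rsa.GenerateKey(rand.Reader, 2048)
	if err != nil {
		return nil, err
	}
	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(1),
		Subject:      pkix.Name{Organization: []string{"jenkins.ha-333994"}},
		NotBefore:    time.Now(),
		NotAfter:     time.Now().Add(365 * 24 * time.Hour),
		KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
		// SANs from the log: IP entries plus DNS names
		IPAddresses: []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.39.180")},
		DNSNames:    []string{"ha-333994", "localhost", "minikube"},
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		return nil, err
	}
	return x509.ParseCertificate(der)
}

func main() {
	cert, err := makeServerCert()
	if err != nil {
		panic(err)
	}
	fmt.Println("DNS SANs:", cert.DNSNames)
	fmt.Println("IP SANs:", cert.IPAddresses)
}
```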
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
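	The `openssl x509 -hash` / `ln -fs` pairs above install each CA certificate under its OpenSSL subject-hash name (e.g. `3ec20f2e.0`), the c_rehash-style layout OpenSSL uses to look up trust anchors. A minimal sketch of that pattern, using a throwaway self-signed cert and a temp dir in place of the run's real paths:

	```shell
	# Sketch of minikube's CA install step: link a cert into the trust dir
	# under its OpenSSL subject hash so verification can find it by name.
	set -eu
	certdir=$(mktemp -d)
	# Throwaway cert standing in for /usr/share/ca-certificates/<id>.pem
	openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo-ca" -days 1 \
	  -keyout "$certdir/demo.key" -out "$certdir/demo.pem" 2>/dev/null
	hash=$(openssl x509 -hash -noout -in "$certdir/demo.pem")
	# Same shape as: test -L /etc/ssl/certs/<hash>.0 || ln -fs <cert> /etc/ssl/certs/<hash>.0
	[ -L "$certdir/$hash.0" ] || ln -fs "$certdir/demo.pem" "$certdir/$hash.0"
	openssl x509 -noout -subject -in "$certdir/$hash.0"
	```

	The hash-named symlink is what lets `openssl verify` (and anything linked against OpenSSL) resolve the issuer without scanning every file in the directory.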
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
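	The four grep-then-rm pairs above are a stale-config sweep: any kubeconfig that does not reference `https://control-plane.minikube.internal:8443` is removed before `kubeadm init` runs (here all four greps fail simply because the files do not exist yet). The same keep-or-drop logic, sketched against throwaway files rather than /etc/kubernetes:

	```shell
	# Sketch of the stale-config cleanup: keep a conf only if it already
	# points at our control-plane endpoint, otherwise delete it.
	set -eu
	confdir=$(mktemp -d)
	endpoint="https://control-plane.minikube.internal:8443"
	printf 'server: %s\n' "$endpoint"            > "$confdir/admin.conf"
	printf 'server: https://other-host:6443\n'   > "$confdir/kubelet.conf"
	for f in "$confdir"/*.conf; do
	  # grep exits 2-style nonzero when the endpoint (or the file) is missing,
	  # which is exactly the "may not be in ... - will remove" branch in the log
	  grep -q "$endpoint" "$f" || rm -f "$f"
	done
	ls "$confdir"
	```

	Running the sweep against missing files behaves the same way: the grep fails, so the follow-up `rm -f` is a harmless no-op.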
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
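	The `--discovery-token-ca-cert-hash sha256:...` value in the join commands above is the SHA-256 digest of the cluster CA's DER-encoded public key. It can be recomputed from a `ca.crt` with the pipeline kubeadm documents; the sketch below uses a freshly generated throwaway CA, since this run's key material is of course not in the log:

	```shell
	# Recompute a kubeadm-style discovery hash from a CA certificate.
	set -eu
	dir=$(mktemp -d)
	# Throwaway CA standing in for /etc/kubernetes/pki/ca.crt
	openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=kubernetes" -days 1 \
	  -keyout "$dir/ca.key" -out "$dir/ca.crt" 2>/dev/null
	# Extract the public key, convert to DER, and SHA-256 it
	hash=$(openssl x509 -pubkey -in "$dir/ca.crt" \
	  | openssl rsa -pubin -outform der 2>/dev/null \
	  | openssl dgst -sha256 -hex | sed 's/^.* //')
	echo "sha256:$hash"
	```

	A joining node compares this hash against the key in the `cluster-info` ConfigMap, which is what lets the bootstrap token be used safely over an untrusted network.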
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
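	The `oom_adj: -16` line comes from reading `/proc/$(pgrep kube-apiserver)/oom_adj`; a strongly negative value means the kernel's OOM killer will avoid the apiserver. `oom_adj` (range -17..15) is the legacy interface; modern kernels primarily expose `oom_score_adj` (range -1000..1000). A sketch of the same check against the current shell's PID, since the apiserver PID is specific to this run:

	```shell
	# Read OOM-killer protection for a process. The log uses the legacy
	# /proc/<pid>/oom_adj file; oom_score_adj is the current equivalent.
	set -eu
	score=$(cat "/proc/$$/oom_score_adj")
	echo "oom_score_adj for this shell: $score"
	```

	Kubelet sets the apiserver's adjustment itself, so minikube only verifies the value here rather than writing it.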
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
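	The run of identical `kubectl get sa default` calls above is a fixed-interval poll: retry roughly every 500ms until the `default` service account exists (about 12.8s on this run), which is how minikube waits out `elevateKubeSystemPrivileges`. The wait-loop shape, simulated with a marker file standing in for the live API check:

	```shell
	# Fixed-interval poll until a condition holds, mirroring the repeated
	# "kubectl get sa default" calls in the log.
	set -eu
	dir=$(mktemp -d)
	# Background job makes the "resource" appear after ~1s
	( sleep 1; touch "$dir/sa-default" ) &
	tries=0
	until [ -e "$dir/sa-default" ]; do   # stand-in for: kubectl get sa default
	  tries=$((tries + 1))
	  sleep 0.5                          # the ~500ms retry interval seen above
	done
	wait
	echo "ready after $tries retries"
	```

	A plain sleep-and-retry loop is adequate here because the service account is created exactly once shortly after init; there is no need for backoff.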
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       18 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       19 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       19 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       19 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       19 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       19 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       19 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       19 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       19 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       19 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       19 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:45:58 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:43:08 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:43:08 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:43:08 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:43:08 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     19m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     19m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         19m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      19m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 19m                kube-proxy       
	  Normal  Starting                 19m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  19m (x4 over 19m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    19m (x4 over 19m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     19m (x3 over 19m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  19m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 19m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  19m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    19m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     19m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  19m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           19m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                19m                kubelet          Node ha-333994 status is now: NodeReady
	
	
	Name:               ha-333994-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_17T17_40_16_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:40:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:46:02 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:40:34 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.197
	  Hostname:    ha-333994-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 91a213a8eb09434f90fc54c32c57b24f
	  System UUID:                91a213a8-eb09-434f-90fc-54c32c57b24f
	  Boot ID:                    45ccee74-7f48-47d9-9195-b6f993074cc5
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-74lsp    0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kindnet-24fc8              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      5m48s
	  kube-system                 kube-proxy-xkkdj           0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m48s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 5m43s                  kube-proxy       
	  Normal  NodeHasSufficientMemory  5m48s (x2 over 5m48s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    5m48s (x2 over 5m48s)  kubelet          Node ha-333994-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m48s (x2 over 5m48s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  5m48s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           5m44s                  node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	  Normal  NodeReady                5m29s                  kubelet          Node ha-333994-m03 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:41:11.077099Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":1506}
	{"level":"info","ts":"2024-07-17T17:41:11.08271Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":1506,"took":"4.803656ms","hash":4135639207,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2002944,"current-db-size-in-use":"2.0 MB"}
	{"level":"info","ts":"2024-07-17T17:41:11.082934Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":4135639207,"revision":1506,"compact-revision":967}
	
	
	==> kernel <==
	 17:46:03 up 20 min,  0 users,  load average: 0.28, 0.26, 0.19
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:44:56.593925       1 main.go:303] handling current node
	I0717 17:45:06.601903       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:06.602313       1 main.go:303] handling current node
	I0717 17:45:06.602450       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:06.602539       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:16.599330       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:16.599416       1 main.go:303] handling current node
	I0717 17:45:16.599444       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:16.599450       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:26.593296       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:26.593343       1 main.go:303] handling current node
	I0717 17:45:26.593373       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:26.593378       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:36.593155       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:36.593308       1 main.go:303] handling current node
	I0717 17:45:36.593330       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:36.593336       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:46.602244       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:46.602295       1 main.go:303] handling current node
	I0717 17:45:46.602314       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:46.602320       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:56.593212       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:56.593261       1 main.go:303] handling current node
	I0717 17:45:56.593285       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:56.593291       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:41:16 ha-333994 kubelet[1321]: E0717 17:41:16.469006    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:41:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:41:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:41:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:41:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:42:16 ha-333994 kubelet[1321]: E0717 17:42:16.469497    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:42:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:42:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:42:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:42:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:43:16 ha-333994 kubelet[1321]: E0717 17:43:16.470172    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:43:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:43:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:43:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:43:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:44:16 ha-333994 kubelet[1321]: E0717 17:44:16.472787    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:44:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:44:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:44:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:44:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:45:16 ha-333994 kubelet[1321]: E0717 17:45:16.469762    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:45:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:45:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:45:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:45:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/RestartSecondaryNode]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6:

-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  8m47s (x3 over 18m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  17s (x3 over 5m29s)  default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/RestartSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/RestartSecondaryNode (314.25s)
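The FailedScheduling events in the post-mortem above ("node(s) didn't match pod anti-affinity rules") are consistent with the busybox test workload spreading one replica per node. The workload's manifest is not included in this log; the following is a hypothetical sketch of the kind of required anti-affinity that would produce exactly these events, not the test's actual source:

```yaml
# Hypothetical manifest fragment (assumed, not from the test source):
# a hard pod anti-affinity that forbids two "app=busybox" pods on the
# same node. With it, a cluster with N Ready nodes can schedule at most
# N replicas; any extra replica stays Pending with the
# "didn't match pod anti-affinity rules" event seen above.
affinity:
  podAntiAffinity:
    requiredDuringSchedulingIgnoredDuringExecution:
      - labelSelector:
          matchLabels:
            app: busybox
        topologyKey: kubernetes.io/hostname
```

Under that assumption, the progression in the events from "0/1 nodes are available" to "0/2 nodes are available" matches one pending replica waiting first for a second Ready node and then for a third one that never became Ready.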

TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (2.22s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
ha_test.go:304: expected profile "ha-333994" in json of 'profile list' to include 4 nodes but have 3 nodes. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-333994\",\"Status\":\"Stopped\",\"Config\":{\"Name\":\"ha-333994\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"kvm2\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACount\":1,\"APIServerPor
t\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.2\",\"ClusterName\":\"ha-333994\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.168.39.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"containerd\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.168.39.180\",\"Port\":8443,\"KubernetesVers
ion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.168.39.127\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.168.39.197\",\"Port\":0,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"\",\"ControlPlane\":false,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":false,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-device-plugin\":
false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/home/jenkins:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOptimizations\"
:false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-linux-amd64 profile list --output json"
ha_test.go:307: expected profile "ha-333994" in json of 'profile list' to have "HAppy" status but have "Stopped" status. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-333994\",\"Status\":\"Stopped\",\"Config\":{\"Name\":\"ha-333994\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"kvm2\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACount\":1,\
"APIServerPort\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.2\",\"ClusterName\":\"ha-333994\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.168.39.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"containerd\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.168.39.180\",\"Port\":8443,\"K
ubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.168.39.127\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.168.39.197\",\"Port\":0,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"\",\"ControlPlane\":false,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":false,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-dev
ice-plugin\":false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/home/jenkins:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOp
timizations\":false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-linux-amd64 profile list --output json"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
E0717 17:46:04.846459   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.168494733s)
helpers_test.go:252: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node stop m02 -v=7         | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node start m02 -v=7        | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:25:37
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:25:37.372173   31817 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:25:37.372300   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372309   31817 out.go:304] Setting ErrFile to fd 2...
	I0717 17:25:37.372316   31817 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:25:37.372515   31817 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:25:37.373068   31817 out.go:298] Setting JSON to false
	I0717 17:25:37.373934   31817 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4080,"bootTime":1721233057,"procs":177,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:25:37.373990   31817 start.go:139] virtualization: kvm guest
	I0717 17:25:37.376261   31817 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:25:37.377830   31817 notify.go:220] Checking for updates...
	I0717 17:25:37.377854   31817 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:25:37.379322   31817 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:25:37.380779   31817 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:25:37.382329   31817 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.383666   31817 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:25:37.384940   31817 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:25:37.386314   31817 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:25:37.420051   31817 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 17:25:37.421589   31817 start.go:297] selected driver: kvm2
	I0717 17:25:37.421607   31817 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:25:37.421618   31817 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:25:37.422327   31817 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.422404   31817 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:25:37.437115   31817 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:25:37.437156   31817 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:25:37.437363   31817 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:25:37.437413   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:25:37.437423   31817 cni.go:136] multinode detected (0 nodes found), recommending kindnet
	I0717 17:25:37.437432   31817 start_flags.go:319] Found "CNI" CNI - setting NetworkPlugin=cni
	I0717 17:25:37.437478   31817 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:container
d CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SS
HAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:25:37.437562   31817 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:25:37.439313   31817 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:25:37.440697   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:25:37.440738   31817 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:25:37.440745   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:25:37.440816   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:25:37.440827   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:25:37.441104   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:37.441121   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json: {Name:mk758d67ae5c79043a711460bac8ff59da52dd50 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:25:37.441235   31817 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:25:37.441263   31817 start.go:364] duration metric: took 16.553µs to acquireMachinesLock for "ha-333994"
	I0717 17:25:37.441278   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kubernete
sVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:
0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:25:37.441331   31817 start.go:125] createHost starting for "" (driver="kvm2")
	I0717 17:25:37.442904   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:25:37.443026   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:25:37.443066   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:25:37.456958   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46637
	I0717 17:25:37.457401   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:25:37.457924   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:25:37.457953   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:25:37.458234   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:25:37.458399   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:37.458508   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:37.458638   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:25:37.458664   31817 client.go:168] LocalClient.Create starting
	I0717 17:25:37.458690   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:25:37.458718   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458731   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458776   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:25:37.458792   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:25:37.458803   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:25:37.458817   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:25:37.458825   31817 main.go:141] libmachine: (ha-333994) Calling .PreCreateCheck
	I0717 17:25:37.459073   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:37.459495   31817 main.go:141] libmachine: Creating machine...
	I0717 17:25:37.459514   31817 main.go:141] libmachine: (ha-333994) Calling .Create
	I0717 17:25:37.459622   31817 main.go:141] libmachine: (ha-333994) Creating KVM machine...
	I0717 17:25:37.460734   31817 main.go:141] libmachine: (ha-333994) DBG | found existing default KVM network
	I0717 17:25:37.461376   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.461245   31840 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc00010f1e0}
	I0717 17:25:37.461396   31817 main.go:141] libmachine: (ha-333994) DBG | created network xml: 
	I0717 17:25:37.461405   31817 main.go:141] libmachine: (ha-333994) DBG | <network>
	I0717 17:25:37.461410   31817 main.go:141] libmachine: (ha-333994) DBG |   <name>mk-ha-333994</name>
	I0717 17:25:37.461416   31817 main.go:141] libmachine: (ha-333994) DBG |   <dns enable='no'/>
	I0717 17:25:37.461420   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461438   31817 main.go:141] libmachine: (ha-333994) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0717 17:25:37.461448   31817 main.go:141] libmachine: (ha-333994) DBG |     <dhcp>
	I0717 17:25:37.461459   31817 main.go:141] libmachine: (ha-333994) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0717 17:25:37.461473   31817 main.go:141] libmachine: (ha-333994) DBG |     </dhcp>
	I0717 17:25:37.461490   31817 main.go:141] libmachine: (ha-333994) DBG |   </ip>
	I0717 17:25:37.461499   31817 main.go:141] libmachine: (ha-333994) DBG |   
	I0717 17:25:37.461508   31817 main.go:141] libmachine: (ha-333994) DBG | </network>
	I0717 17:25:37.461513   31817 main.go:141] libmachine: (ha-333994) DBG | 
	I0717 17:25:37.467087   31817 main.go:141] libmachine: (ha-333994) DBG | trying to create private KVM network mk-ha-333994 192.168.39.0/24...
	I0717 17:25:37.530969   31817 main.go:141] libmachine: (ha-333994) DBG | private KVM network mk-ha-333994 192.168.39.0/24 created
	I0717 17:25:37.531012   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.530957   31840 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:37.531029   31817 main.go:141] libmachine: (ha-333994) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:37.531050   31817 main.go:141] libmachine: (ha-333994) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:25:37.531153   31817 main.go:141] libmachine: (ha-333994) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:25:37.769775   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:37.769643   31840 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa...
	I0717 17:25:38.127523   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127394   31840 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk...
	I0717 17:25:38.127548   31817 main.go:141] libmachine: (ha-333994) DBG | Writing magic tar header
	I0717 17:25:38.127558   31817 main.go:141] libmachine: (ha-333994) DBG | Writing SSH key tar header
	I0717 17:25:38.127566   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:38.127499   31840 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 ...
	I0717 17:25:38.127579   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994
	I0717 17:25:38.127621   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994 (perms=drwx------)
	I0717 17:25:38.127638   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:25:38.127649   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:25:38.127659   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:25:38.127674   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:25:38.127685   31817 main.go:141] libmachine: (ha-333994) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:25:38.127697   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:38.127708   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:25:38.127720   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:25:38.127729   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:25:38.127736   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:25:38.127763   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:25:38.127774   31817 main.go:141] libmachine: (ha-333994) DBG | Checking permissions on dir: /home
	I0717 17:25:38.127787   31817 main.go:141] libmachine: (ha-333994) DBG | Skipping /home - not owner
	I0717 17:25:38.128688   31817 main.go:141] libmachine: (ha-333994) define libvirt domain using xml: 
	I0717 17:25:38.128706   31817 main.go:141] libmachine: (ha-333994) <domain type='kvm'>
	I0717 17:25:38.128716   31817 main.go:141] libmachine: (ha-333994)   <name>ha-333994</name>
	I0717 17:25:38.128724   31817 main.go:141] libmachine: (ha-333994)   <memory unit='MiB'>2200</memory>
	I0717 17:25:38.128733   31817 main.go:141] libmachine: (ha-333994)   <vcpu>2</vcpu>
	I0717 17:25:38.128743   31817 main.go:141] libmachine: (ha-333994)   <features>
	I0717 17:25:38.128752   31817 main.go:141] libmachine: (ha-333994)     <acpi/>
	I0717 17:25:38.128758   31817 main.go:141] libmachine: (ha-333994)     <apic/>
	I0717 17:25:38.128768   31817 main.go:141] libmachine: (ha-333994)     <pae/>
	I0717 17:25:38.128788   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.128800   31817 main.go:141] libmachine: (ha-333994)   </features>
	I0717 17:25:38.128818   31817 main.go:141] libmachine: (ha-333994)   <cpu mode='host-passthrough'>
	I0717 17:25:38.128833   31817 main.go:141] libmachine: (ha-333994)   
	I0717 17:25:38.128844   31817 main.go:141] libmachine: (ha-333994)   </cpu>
	I0717 17:25:38.128854   31817 main.go:141] libmachine: (ha-333994)   <os>
	I0717 17:25:38.128867   31817 main.go:141] libmachine: (ha-333994)     <type>hvm</type>
	I0717 17:25:38.128878   31817 main.go:141] libmachine: (ha-333994)     <boot dev='cdrom'/>
	I0717 17:25:38.128890   31817 main.go:141] libmachine: (ha-333994)     <boot dev='hd'/>
	I0717 17:25:38.128901   31817 main.go:141] libmachine: (ha-333994)     <bootmenu enable='no'/>
	I0717 17:25:38.128927   31817 main.go:141] libmachine: (ha-333994)   </os>
	I0717 17:25:38.128949   31817 main.go:141] libmachine: (ha-333994)   <devices>
	I0717 17:25:38.128960   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='cdrom'>
	I0717 17:25:38.128974   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/boot2docker.iso'/>
	I0717 17:25:38.128988   31817 main.go:141] libmachine: (ha-333994)       <target dev='hdc' bus='scsi'/>
	I0717 17:25:38.128998   31817 main.go:141] libmachine: (ha-333994)       <readonly/>
	I0717 17:25:38.129007   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129031   31817 main.go:141] libmachine: (ha-333994)     <disk type='file' device='disk'>
	I0717 17:25:38.129043   31817 main.go:141] libmachine: (ha-333994)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:25:38.129057   31817 main.go:141] libmachine: (ha-333994)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/ha-333994.rawdisk'/>
	I0717 17:25:38.129067   31817 main.go:141] libmachine: (ha-333994)       <target dev='hda' bus='virtio'/>
	I0717 17:25:38.129079   31817 main.go:141] libmachine: (ha-333994)     </disk>
	I0717 17:25:38.129089   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129098   31817 main.go:141] libmachine: (ha-333994)       <source network='mk-ha-333994'/>
	I0717 17:25:38.129109   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129125   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129143   31817 main.go:141] libmachine: (ha-333994)     <interface type='network'>
	I0717 17:25:38.129156   31817 main.go:141] libmachine: (ha-333994)       <source network='default'/>
	I0717 17:25:38.129166   31817 main.go:141] libmachine: (ha-333994)       <model type='virtio'/>
	I0717 17:25:38.129177   31817 main.go:141] libmachine: (ha-333994)     </interface>
	I0717 17:25:38.129185   31817 main.go:141] libmachine: (ha-333994)     <serial type='pty'>
	I0717 17:25:38.129197   31817 main.go:141] libmachine: (ha-333994)       <target port='0'/>
	I0717 17:25:38.129212   31817 main.go:141] libmachine: (ha-333994)     </serial>
	I0717 17:25:38.129237   31817 main.go:141] libmachine: (ha-333994)     <console type='pty'>
	I0717 17:25:38.129257   31817 main.go:141] libmachine: (ha-333994)       <target type='serial' port='0'/>
	I0717 17:25:38.129277   31817 main.go:141] libmachine: (ha-333994)     </console>
	I0717 17:25:38.129288   31817 main.go:141] libmachine: (ha-333994)     <rng model='virtio'>
	I0717 17:25:38.129301   31817 main.go:141] libmachine: (ha-333994)       <backend model='random'>/dev/random</backend>
	I0717 17:25:38.129310   31817 main.go:141] libmachine: (ha-333994)     </rng>
	I0717 17:25:38.129321   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129333   31817 main.go:141] libmachine: (ha-333994)     
	I0717 17:25:38.129343   31817 main.go:141] libmachine: (ha-333994)   </devices>
	I0717 17:25:38.129353   31817 main.go:141] libmachine: (ha-333994) </domain>
	I0717 17:25:38.129364   31817 main.go:141] libmachine: (ha-333994) 
	I0717 17:25:38.133746   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:7d:ea:ab in network default
	I0717 17:25:38.134333   31817 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:25:38.134354   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:38.134949   31817 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:25:38.135204   31817 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:25:38.135633   31817 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:25:38.136245   31817 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:25:39.310815   31817 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:25:39.311620   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.312037   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.312090   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.312036   31840 retry.go:31] will retry after 308.80623ms: waiting for machine to come up
	I0717 17:25:39.622682   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.623065   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.623083   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.623047   31840 retry.go:31] will retry after 344.848861ms: waiting for machine to come up
	I0717 17:25:39.969533   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:39.969924   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:39.969950   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:39.969868   31840 retry.go:31] will retry after 339.149265ms: waiting for machine to come up
	I0717 17:25:40.310470   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.310889   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.310915   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.310855   31840 retry.go:31] will retry after 442.455692ms: waiting for machine to come up
	I0717 17:25:40.754326   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:40.754769   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:40.754793   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:40.754727   31840 retry.go:31] will retry after 692.369602ms: waiting for machine to come up
	I0717 17:25:41.448430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:41.448821   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:41.448845   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:41.448784   31840 retry.go:31] will retry after 888.634073ms: waiting for machine to come up
	I0717 17:25:42.338562   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:42.338956   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:42.338987   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:42.338917   31840 retry.go:31] will retry after 958.652231ms: waiting for machine to come up
	I0717 17:25:43.299646   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:43.300036   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:43.300060   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:43.299996   31840 retry.go:31] will retry after 1.026520774s: waiting for machine to come up
	I0717 17:25:44.328045   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:44.328353   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:44.328378   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:44.328319   31840 retry.go:31] will retry after 1.144606861s: waiting for machine to come up
	I0717 17:25:45.474485   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:45.474883   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:45.474908   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:45.474852   31840 retry.go:31] will retry after 2.320040547s: waiting for machine to come up
	I0717 17:25:47.796771   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:47.797227   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:47.797257   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:47.797189   31840 retry.go:31] will retry after 2.900412309s: waiting for machine to come up
	I0717 17:25:50.701258   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:50.701734   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:50.701785   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:50.701700   31840 retry.go:31] will retry after 2.901702791s: waiting for machine to come up
	I0717 17:25:53.605129   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:53.605559   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:25:53.605577   31817 main.go:141] libmachine: (ha-333994) DBG | I0717 17:25:53.605522   31840 retry.go:31] will retry after 3.63399522s: waiting for machine to come up
	I0717 17:25:57.240563   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.240970   31817 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:25:57.241006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.241016   31817 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:25:57.241422   31817 main.go:141] libmachine: (ha-333994) DBG | unable to find host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994
	I0717 17:25:57.311172   31817 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:25:57.311209   31817 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:25:57.311222   31817 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:25:57.313438   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313869   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:minikube Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.313914   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.313935   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:25:57.313972   31817 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:25:57.314013   31817 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:25:57.314051   31817 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:25:57.314064   31817 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:25:57.442005   31817 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:25:57.442249   31817 main.go:141] libmachine: (ha-333994) KVM machine creation complete!
	I0717 17:25:57.442580   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:57.443082   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443285   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:57.443431   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:25:57.443445   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:25:57.444683   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:25:57.444702   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:25:57.444710   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:25:57.444718   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.446779   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447118   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.447145   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.447285   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.447420   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447569   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.447686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.447850   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.448075   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.448086   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:25:57.561413   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.561435   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:25:57.561444   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.564006   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564331   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.564353   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.564530   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.564739   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.564886   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.565046   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.565213   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.565388   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.565402   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:25:57.678978   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:25:57.679062   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:25:57.679075   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:25:57.679085   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679397   31817 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:25:57.679418   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.679587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.682101   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682468   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.682497   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.682625   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.682902   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683088   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.683236   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.683384   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.683567   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.683582   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:25:57.808613   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:25:57.808643   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.811150   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811462   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.811484   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.811633   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:57.811819   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.811975   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:57.812114   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:57.812259   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:57.812470   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:57.812492   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:25:57.935982   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:25:57.936010   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:25:57.936045   31817 buildroot.go:174] setting up certificates
	I0717 17:25:57.936053   31817 provision.go:84] configureAuth start
	I0717 17:25:57.936064   31817 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:25:57.936323   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:57.938795   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939097   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.939122   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.939256   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:57.941132   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941439   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:57.941465   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:57.941555   31817 provision.go:143] copyHostCerts
	I0717 17:25:57.941591   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941628   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:25:57.941644   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:25:57.941723   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:25:57.941842   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941865   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:25:57.941872   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:25:57.941911   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:25:57.941974   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942004   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:25:57.942014   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:25:57.942052   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:25:57.942132   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:25:58.111694   31817 provision.go:177] copyRemoteCerts
	I0717 17:25:58.111759   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:25:58.111785   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.114260   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114541   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.114565   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.114746   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.114900   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.115022   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.115159   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.204834   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:25:58.204915   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:25:58.233451   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:25:58.233504   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0717 17:25:58.260715   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:25:58.260793   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:25:58.288074   31817 provision.go:87] duration metric: took 352.00837ms to configureAuth
	I0717 17:25:58.288100   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:25:58.288281   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:25:58.288301   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:25:58.288311   31817 main.go:141] libmachine: (ha-333994) Calling .GetURL
	I0717 17:25:58.289444   31817 main.go:141] libmachine: (ha-333994) DBG | Using libvirt version 6000000
	I0717 17:25:58.291569   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.291932   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.291957   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.292117   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:25:58.292130   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:25:58.292136   31817 client.go:171] duration metric: took 20.833465773s to LocalClient.Create
	I0717 17:25:58.292154   31817 start.go:167] duration metric: took 20.833518022s to libmachine.API.Create "ha-333994"
	I0717 17:25:58.292162   31817 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:25:58.292170   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:25:58.292186   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.292380   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:25:58.292412   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.294705   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.294988   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.295011   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.295156   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.295308   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.295448   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.295547   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.380876   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:25:58.385479   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:25:58.385504   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:25:58.385563   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:25:58.385657   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:25:58.385670   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:25:58.385792   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:25:58.395135   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:25:58.422415   31817 start.go:296] duration metric: took 130.238563ms for postStartSetup
	I0717 17:25:58.422468   31817 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:25:58.423096   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.425440   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.425742   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.425767   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.426007   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:25:58.426198   31817 start.go:128] duration metric: took 20.984856664s to createHost
	I0717 17:25:58.426221   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.428248   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428511   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.428538   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.428637   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.428826   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.428930   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.429005   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.429097   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:25:58.429257   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:25:58.429266   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:25:58.543836   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237158.504657493
	
	I0717 17:25:58.543858   31817 fix.go:216] guest clock: 1721237158.504657493
	I0717 17:25:58.543867   31817 fix.go:229] Guest: 2024-07-17 17:25:58.504657493 +0000 UTC Remote: 2024-07-17 17:25:58.426211523 +0000 UTC m=+21.086147695 (delta=78.44597ms)
	I0717 17:25:58.543886   31817 fix.go:200] guest clock delta is within tolerance: 78.44597ms
	I0717 17:25:58.543891   31817 start.go:83] releasing machines lock for "ha-333994", held for 21.102620399s
	I0717 17:25:58.543907   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.544173   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:25:58.546693   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547047   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.547072   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.547197   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547654   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547823   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:25:58.547916   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:25:58.547962   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.548054   31817 ssh_runner.go:195] Run: cat /version.json
	I0717 17:25:58.548080   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:25:58.550378   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550648   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550679   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.550978   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.550982   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551129   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551187   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:25:58.551227   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.551240   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:25:58.551305   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:25:58.551318   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.551480   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:25:58.551686   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:25:58.552927   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:25:58.654133   31817 ssh_runner.go:195] Run: systemctl --version
	I0717 17:25:58.660072   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:25:58.665532   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:25:58.665586   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:25:58.682884   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:25:58.682906   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:25:58.682966   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:25:58.710921   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:25:58.724815   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:25:58.724862   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:25:58.738870   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:25:58.752912   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:25:58.873905   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:25:59.009226   31817 docker.go:233] disabling docker service ...
	I0717 17:25:59.009286   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:25:59.024317   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:25:59.037729   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:25:59.178928   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:25:59.308950   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:25:59.322702   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:25:59.341915   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:25:59.352890   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:25:59.363450   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:25:59.363513   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:25:59.374006   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.384984   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:25:59.395933   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:25:59.406370   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:25:59.416834   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:25:59.427824   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:25:59.438419   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:25:59.448933   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:25:59.458271   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:25:59.458321   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:25:59.471288   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:25:59.480733   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:25:59.597561   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:25:59.625448   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:25:59.625540   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:25:59.630090   31817 retry.go:31] will retry after 1.114753424s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:00.745398   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:00.750563   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:00.750619   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:00.754270   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:00.794015   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:00.794075   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.821370   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:00.850476   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:00.851699   31817 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:26:00.854267   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854598   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:00.854625   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:00.854810   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:00.858914   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:00.872028   31817 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:26:00.872129   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:00.872173   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:00.904349   31817 containerd.go:623] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.30.2". assuming images are not preloaded.
	I0717 17:26:00.904418   31817 ssh_runner.go:195] Run: which lz4
	I0717 17:26:00.908264   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 -> /preloaded.tar.lz4
	I0717 17:26:00.908363   31817 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4
	I0717 17:26:00.912476   31817 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0717 17:26:00.912508   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (394473408 bytes)
	I0717 17:26:02.292043   31817 containerd.go:563] duration metric: took 1.383715694s to copy over tarball
	I0717 17:26:02.292124   31817 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0717 17:26:04.380435   31817 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.088281526s)
	I0717 17:26:04.380473   31817 containerd.go:570] duration metric: took 2.088397847s to extract the tarball
	I0717 17:26:04.380483   31817 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0717 17:26:04.417289   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:04.532503   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:04.562019   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.594139   31817 retry.go:31] will retry after 159.715137ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-07-17T17:26:04Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0717 17:26:04.754516   31817 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:26:04.790521   31817 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:26:04.790541   31817 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:26:04.790548   31817 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:26:04.790647   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:04.790702   31817 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:26:04.826334   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:04.826357   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:04.826364   31817 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:26:04.826385   31817 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:26:04.826538   31817 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:26:04.826560   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:04.826608   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:04.845088   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:04.845186   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/super-admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:04.845237   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:04.855420   31817 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:26:04.855490   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:26:04.865095   31817 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:26:04.882653   31817 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:26:04.899447   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:26:04.917467   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1447 bytes)
	I0717 17:26:04.934831   31817 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:26:04.938924   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:04.951512   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:05.064475   31817 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:26:05.091657   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:26:05.091681   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:05.091701   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.091873   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:05.091927   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:05.091942   31817 certs.go:256] generating profile certs ...
	I0717 17:26:05.092017   31817 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:05.092036   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt with IP's: []
	I0717 17:26:05.333847   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt ...
	I0717 17:26:05.333874   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt: {Name:mk777cbb40105a68e3f77323fe294b684956fe92 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334027   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key ...
	I0717 17:26:05.334037   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key: {Name:mk5d028eb3d5165101367caeb298d78e1ef97418 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.334107   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e
	I0717 17:26:05.334145   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.254]
	I0717 17:26:05.424786   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e ...
	I0717 17:26:05.424814   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e: {Name:mk0136c8aa6e3dcb0178d33e23c8a472c3572950 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.424956   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e ...
	I0717 17:26:05.424968   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e: {Name:mk21a2bd5914e6b9398865902ece829e628c40ef Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.425035   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:05.425116   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.7fec389e -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:05.425167   31817 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:05.425180   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt with IP's: []
	I0717 17:26:05.709359   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt ...
	I0717 17:26:05.709387   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt: {Name:mk00da479f15831c3fb1174ab8fe01112b152616 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709526   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key ...
	I0717 17:26:05.709536   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key: {Name:mk48280e7c358eaec39922f30f6427d18e40d4e2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:05.709599   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:05.709615   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:05.709625   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:05.709637   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:05.709649   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:05.709664   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:05.709675   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:05.709686   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:05.709732   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:05.709772   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:05.709781   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:05.709804   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:05.709828   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:05.709854   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:05.709889   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:05.709937   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:05.709953   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:05.709962   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:05.710499   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:05.736286   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:05.762624   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:05.789813   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:05.816731   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1424 bytes)
	I0717 17:26:05.843922   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0717 17:26:05.890090   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:05.917641   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:05.942689   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:05.968245   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:05.991503   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:06.014644   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:26:06.030964   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:06.036668   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:06.047444   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051872   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.051933   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:06.057696   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:06.068885   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:06.079816   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084516   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.084582   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:06.090194   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:06.100911   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:06.112203   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116753   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.116812   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:06.122686   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:06.133462   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:06.137718   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:06.137774   31817 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:26:06.137852   31817 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:26:06.137906   31817 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:26:06.181182   31817 cri.go:89] found id: ""
	I0717 17:26:06.181252   31817 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:26:06.191588   31817 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0717 17:26:06.201776   31817 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0717 17:26:06.211610   31817 kubeadm.go:155] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0717 17:26:06.211628   31817 kubeadm.go:157] found existing configuration files:
	
	I0717 17:26:06.211668   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0717 17:26:06.221376   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0717 17:26:06.221428   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0717 17:26:06.231162   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0717 17:26:06.240465   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0717 17:26:06.240520   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0717 17:26:06.250464   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.260016   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0717 17:26:06.260071   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0717 17:26:06.269931   31817 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0717 17:26:06.279357   31817 kubeadm.go:163] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0717 17:26:06.279423   31817 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0717 17:26:06.289124   31817 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.30.2:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0717 17:26:06.540765   31817 kubeadm.go:310] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0717 17:26:16.854837   31817 kubeadm.go:310] [init] Using Kubernetes version: v1.30.2
	I0717 17:26:16.854895   31817 kubeadm.go:310] [preflight] Running pre-flight checks
	I0717 17:26:16.854996   31817 kubeadm.go:310] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0717 17:26:16.855136   31817 kubeadm.go:310] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0717 17:26:16.855227   31817 kubeadm.go:310] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0717 17:26:16.855281   31817 kubeadm.go:310] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0717 17:26:16.856908   31817 out.go:204]   - Generating certificates and keys ...
	I0717 17:26:16.856974   31817 kubeadm.go:310] [certs] Using existing ca certificate authority
	I0717 17:26:16.857030   31817 kubeadm.go:310] [certs] Using existing apiserver certificate and key on disk
	I0717 17:26:16.857098   31817 kubeadm.go:310] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0717 17:26:16.857147   31817 kubeadm.go:310] [certs] Generating "front-proxy-ca" certificate and key
	I0717 17:26:16.857206   31817 kubeadm.go:310] [certs] Generating "front-proxy-client" certificate and key
	I0717 17:26:16.857246   31817 kubeadm.go:310] [certs] Generating "etcd/ca" certificate and key
	I0717 17:26:16.857299   31817 kubeadm.go:310] [certs] Generating "etcd/server" certificate and key
	I0717 17:26:16.857447   31817 kubeadm.go:310] [certs] etcd/server serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857539   31817 kubeadm.go:310] [certs] Generating "etcd/peer" certificate and key
	I0717 17:26:16.857713   31817 kubeadm.go:310] [certs] etcd/peer serving cert is signed for DNS names [ha-333994 localhost] and IPs [192.168.39.180 127.0.0.1 ::1]
	I0717 17:26:16.857815   31817 kubeadm.go:310] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0717 17:26:16.857909   31817 kubeadm.go:310] [certs] Generating "apiserver-etcd-client" certificate and key
	I0717 17:26:16.857973   31817 kubeadm.go:310] [certs] Generating "sa" key and public key
	I0717 17:26:16.858063   31817 kubeadm.go:310] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0717 17:26:16.858158   31817 kubeadm.go:310] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0717 17:26:16.858237   31817 kubeadm.go:310] [kubeconfig] Writing "super-admin.conf" kubeconfig file
	I0717 17:26:16.858285   31817 kubeadm.go:310] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0717 17:26:16.858338   31817 kubeadm.go:310] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0717 17:26:16.858384   31817 kubeadm.go:310] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0717 17:26:16.858464   31817 kubeadm.go:310] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0717 17:26:16.858535   31817 kubeadm.go:310] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0717 17:26:16.860941   31817 out.go:204]   - Booting up control plane ...
	I0717 17:26:16.861023   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0717 17:26:16.861114   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0717 17:26:16.861201   31817 kubeadm.go:310] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0717 17:26:16.861312   31817 kubeadm.go:310] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0717 17:26:16.861419   31817 kubeadm.go:310] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0717 17:26:16.861463   31817 kubeadm.go:310] [kubelet-start] Starting the kubelet
	I0717 17:26:16.861573   31817 kubeadm.go:310] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests"
	I0717 17:26:16.861661   31817 kubeadm.go:310] [kubelet-check] Waiting for a healthy kubelet. This can take up to 4m0s
	I0717 17:26:16.861750   31817 kubeadm.go:310] [kubelet-check] The kubelet is healthy after 501.96481ms
	I0717 17:26:16.861834   31817 kubeadm.go:310] [api-check] Waiting for a healthy API server. This can take up to 4m0s
	I0717 17:26:16.861884   31817 kubeadm.go:310] [api-check] The API server is healthy after 5.974489427s
	I0717 17:26:16.862127   31817 kubeadm.go:310] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0717 17:26:16.862266   31817 kubeadm.go:310] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0717 17:26:16.862320   31817 kubeadm.go:310] [upload-certs] Skipping phase. Please see --upload-certs
	I0717 17:26:16.862517   31817 kubeadm.go:310] [mark-control-plane] Marking the node ha-333994 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0717 17:26:16.862583   31817 kubeadm.go:310] [bootstrap-token] Using token: nha8at.aampri4d84mofmvm
	I0717 17:26:16.863863   31817 out.go:204]   - Configuring RBAC rules ...
	I0717 17:26:16.863958   31817 kubeadm.go:310] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0717 17:26:16.864053   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0717 17:26:16.864187   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0717 17:26:16.864354   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0717 17:26:16.864468   31817 kubeadm.go:310] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0717 17:26:16.864606   31817 kubeadm.go:310] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0717 17:26:16.864779   31817 kubeadm.go:310] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0717 17:26:16.864819   31817 kubeadm.go:310] [addons] Applied essential addon: CoreDNS
	I0717 17:26:16.864861   31817 kubeadm.go:310] [addons] Applied essential addon: kube-proxy
	I0717 17:26:16.864867   31817 kubeadm.go:310] 
	I0717 17:26:16.864915   31817 kubeadm.go:310] Your Kubernetes control-plane has initialized successfully!
	I0717 17:26:16.864921   31817 kubeadm.go:310] 
	I0717 17:26:16.864989   31817 kubeadm.go:310] To start using your cluster, you need to run the following as a regular user:
	I0717 17:26:16.865003   31817 kubeadm.go:310] 
	I0717 17:26:16.865036   31817 kubeadm.go:310]   mkdir -p $HOME/.kube
	I0717 17:26:16.865087   31817 kubeadm.go:310]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0717 17:26:16.865148   31817 kubeadm.go:310]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0717 17:26:16.865158   31817 kubeadm.go:310] 
	I0717 17:26:16.865241   31817 kubeadm.go:310] Alternatively, if you are the root user, you can run:
	I0717 17:26:16.865256   31817 kubeadm.go:310] 
	I0717 17:26:16.865326   31817 kubeadm.go:310]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0717 17:26:16.865337   31817 kubeadm.go:310] 
	I0717 17:26:16.865412   31817 kubeadm.go:310] You should now deploy a pod network to the cluster.
	I0717 17:26:16.865511   31817 kubeadm.go:310] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0717 17:26:16.865586   31817 kubeadm.go:310]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0717 17:26:16.865592   31817 kubeadm.go:310] 
	I0717 17:26:16.865681   31817 kubeadm.go:310] You can now join any number of control-plane nodes by copying certificate authorities
	I0717 17:26:16.865783   31817 kubeadm.go:310] and service account keys on each node and then running the following as root:
	I0717 17:26:16.865794   31817 kubeadm.go:310] 
	I0717 17:26:16.865910   31817 kubeadm.go:310]   kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866069   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 \
	I0717 17:26:16.866105   31817 kubeadm.go:310] 	--control-plane 
	I0717 17:26:16.866127   31817 kubeadm.go:310] 
	I0717 17:26:16.866222   31817 kubeadm.go:310] Then you can join any number of worker nodes by running the following on each as root:
	I0717 17:26:16.866229   31817 kubeadm.go:310] 
	I0717 17:26:16.866315   31817 kubeadm.go:310] kubeadm join control-plane.minikube.internal:8443 --token nha8at.aampri4d84mofmvm \
	I0717 17:26:16.866474   31817 kubeadm.go:310] 	--discovery-token-ca-cert-hash sha256:a60e42bdf4c234276b18cf44d8d4bb8b184659f5dc63b21861fc880bef0ea484 
	I0717 17:26:16.866487   31817 cni.go:84] Creating CNI manager for ""
	I0717 17:26:16.866496   31817 cni.go:136] multinode detected (1 nodes found), recommending kindnet
	I0717 17:26:16.867885   31817 out.go:177] * Configuring CNI (Container Networking Interface) ...
	I0717 17:26:16.868963   31817 ssh_runner.go:195] Run: stat /opt/cni/bin/portmap
	I0717 17:26:16.874562   31817 cni.go:182] applying CNI manifest using /var/lib/minikube/binaries/v1.30.2/kubectl ...
	I0717 17:26:16.874582   31817 ssh_runner.go:362] scp memory --> /var/tmp/minikube/cni.yaml (2438 bytes)
	I0717 17:26:16.893967   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl apply --kubeconfig=/var/lib/minikube/kubeconfig -f /var/tmp/minikube/cni.yaml
	I0717 17:26:17.240919   31817 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0717 17:26:17.241000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.241050   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes ha-333994 minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700 minikube.k8s.io/version=v1.33.1 minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86 minikube.k8s.io/name=ha-333994 minikube.k8s.io/primary=true
	I0717 17:26:17.265880   31817 ops.go:34] apiserver oom_adj: -16
	I0717 17:26:17.373587   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:17.874354   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.374127   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:18.874198   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.374489   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:19.874572   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.373924   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:20.874355   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.373893   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:21.874071   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.374000   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:22.873730   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.374382   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:23.874233   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.374181   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:24.874599   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.374533   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:25.874592   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.373806   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:26.874333   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.373913   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:27.874327   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.373877   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:28.873887   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.374632   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:29.874052   31817 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.30.2/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0717 17:26:30.024970   31817 kubeadm.go:1113] duration metric: took 12.784009766s to wait for elevateKubeSystemPrivileges
	I0717 17:26:30.025013   31817 kubeadm.go:394] duration metric: took 23.887240562s to StartCluster
	I0717 17:26:30.025031   31817 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.025112   31817 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.026088   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:30.026357   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0717 17:26:30.026385   31817 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:30.026411   31817 start.go:241] waiting for startup goroutines ...
	I0717 17:26:30.026428   31817 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:26:30.026497   31817 addons.go:69] Setting storage-provisioner=true in profile "ha-333994"
	I0717 17:26:30.026512   31817 addons.go:69] Setting default-storageclass=true in profile "ha-333994"
	I0717 17:26:30.026541   31817 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "ha-333994"
	I0717 17:26:30.026571   31817 addons.go:234] Setting addon storage-provisioner=true in "ha-333994"
	I0717 17:26:30.026609   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:30.026621   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.026938   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.026980   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.026991   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.027043   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.041651   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42585
	I0717 17:26:30.042154   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35951
	I0717 17:26:30.042786   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.043559   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.043586   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.043583   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.044032   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044132   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.044154   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.044459   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.044627   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.045452   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.045489   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.046872   31817 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:26:30.047164   31817 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:26:30.047615   31817 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:26:30.047786   31817 addons.go:234] Setting addon default-storageclass=true in "ha-333994"
	I0717 17:26:30.047815   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:30.048048   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.048070   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.062004   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39399
	I0717 17:26:30.062451   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.062948   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.062973   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.063274   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.063821   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:30.063852   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:30.064986   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41061
	I0717 17:26:30.065414   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.066072   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.066093   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.066486   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.066685   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.068400   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.070565   31817 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0717 17:26:30.072061   31817 addons.go:431] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.072111   31817 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0717 17:26:30.072172   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.075414   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.075887   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.075945   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.076100   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.076283   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.076404   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.076550   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.080633   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38227
	I0717 17:26:30.081042   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:30.081529   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:30.081553   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:30.081832   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:30.082004   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:30.083501   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:30.083712   31817 addons.go:431] installing /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.083728   31817 ssh_runner.go:362] scp storageclass/storageclass.yaml --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0717 17:26:30.083744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:30.086186   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086587   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:30.086610   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:30.086776   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:30.086954   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:30.087117   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:30.087256   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:30.228292   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.30.2/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0717 17:26:30.301671   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0717 17:26:30.365207   31817 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.30.2/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0717 17:26:30.867357   31817 start.go:971] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0717 17:26:30.994695   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994720   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.994814   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.994839   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995019   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995032   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995042   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995049   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995083   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995094   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995102   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:30.995109   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:30.995113   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995338   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995354   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995425   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:30.995442   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:30.995454   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:30.995583   31817 round_trippers.go:463] GET https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses
	I0717 17:26:30.995597   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:30.995607   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:30.995615   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.008616   31817 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0717 17:26:31.009189   31817 round_trippers.go:463] PUT https://192.168.39.254:8443/apis/storage.k8s.io/v1/storageclasses/standard
	I0717 17:26:31.009203   31817 round_trippers.go:469] Request Headers:
	I0717 17:26:31.009211   31817 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:26:31.009218   31817 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:26:31.009222   31817 round_trippers.go:473]     Content-Type: application/json
	I0717 17:26:31.018362   31817 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0717 17:26:31.018530   31817 main.go:141] libmachine: Making call to close driver server
	I0717 17:26:31.018542   31817 main.go:141] libmachine: (ha-333994) Calling .Close
	I0717 17:26:31.018820   31817 main.go:141] libmachine: Successfully made call to close driver server
	I0717 17:26:31.018857   31817 main.go:141] libmachine: (ha-333994) DBG | Closing plugin on server side
	I0717 17:26:31.018879   31817 main.go:141] libmachine: Making call to close connection to plugin binary
	I0717 17:26:31.020620   31817 out.go:177] * Enabled addons: storage-provisioner, default-storageclass
	I0717 17:26:31.022095   31817 addons.go:510] duration metric: took 995.669545ms for enable addons: enabled=[storage-provisioner default-storageclass]
	I0717 17:26:31.022154   31817 start.go:246] waiting for cluster config update ...
	I0717 17:26:31.022168   31817 start.go:255] writing updated cluster config ...
	I0717 17:26:31.023733   31817 out.go:177] 
	I0717 17:26:31.025261   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:31.025354   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.027151   31817 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:26:31.028468   31817 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:26:31.028493   31817 cache.go:56] Caching tarball of preloaded images
	I0717 17:26:31.028581   31817 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:26:31.028597   31817 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:26:31.028681   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:31.028874   31817 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:26:31.028940   31817 start.go:364] duration metric: took 41.632µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:26:31.028968   31817 start.go:93] Provisioning new machine with config: &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m02 IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:26:31.029076   31817 start.go:125] createHost starting for "m02" (driver="kvm2")
	I0717 17:26:31.030724   31817 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0717 17:26:31.030825   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:31.030857   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:31.044970   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37577
	I0717 17:26:31.045405   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:31.045822   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:31.045844   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:31.046177   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:31.046354   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:31.046509   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:31.046649   31817 start.go:159] libmachine.API.Create for "ha-333994" (driver="kvm2")
	I0717 17:26:31.046672   31817 client.go:168] LocalClient.Create starting
	I0717 17:26:31.046708   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem
	I0717 17:26:31.046743   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046763   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046824   31817 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem
	I0717 17:26:31.046847   31817 main.go:141] libmachine: Decoding PEM data...
	I0717 17:26:31.046863   31817 main.go:141] libmachine: Parsing certificate...
	I0717 17:26:31.046888   31817 main.go:141] libmachine: Running pre-create checks...
	I0717 17:26:31.046900   31817 main.go:141] libmachine: (ha-333994-m02) Calling .PreCreateCheck
	I0717 17:26:31.047078   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:31.047493   31817 main.go:141] libmachine: Creating machine...
	I0717 17:26:31.047506   31817 main.go:141] libmachine: (ha-333994-m02) Calling .Create
	I0717 17:26:31.047622   31817 main.go:141] libmachine: (ha-333994-m02) Creating KVM machine...
	I0717 17:26:31.048765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing default KVM network
	I0717 17:26:31.048898   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found existing private KVM network mk-ha-333994
	I0717 17:26:31.048996   31817 main.go:141] libmachine: (ha-333994-m02) Setting up store path in /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.049023   31817 main.go:141] libmachine: (ha-333994-m02) Building disk image from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:26:31.049102   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.048983   32198 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.049157   31817 main.go:141] libmachine: (ha-333994-m02) Downloading /home/jenkins/minikube-integration/19283-14409/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso...
	I0717 17:26:31.264550   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.264392   32198 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa...
	I0717 17:26:31.437178   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437075   32198 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk...
	I0717 17:26:31.437206   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing magic tar header
	I0717 17:26:31.437216   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Writing SSH key tar header
	I0717 17:26:31.437287   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:31.437231   32198 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 ...
	I0717 17:26:31.437381   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02
	I0717 17:26:31.437404   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube/machines
	I0717 17:26:31.437414   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02 (perms=drwx------)
	I0717 17:26:31.437427   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube/machines (perms=drwxr-xr-x)
	I0717 17:26:31.437434   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409/.minikube (perms=drwxr-xr-x)
	I0717 17:26:31.437446   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration/19283-14409 (perms=drwxrwxr-x)
	I0717 17:26:31.437455   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0717 17:26:31.437469   31817 main.go:141] libmachine: (ha-333994-m02) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0717 17:26:31.437487   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:26:31.437496   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:31.437506   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/19283-14409
	I0717 17:26:31.437514   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0717 17:26:31.437521   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home/jenkins
	I0717 17:26:31.437528   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Checking permissions on dir: /home
	I0717 17:26:31.437535   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Skipping /home - not owner
	I0717 17:26:31.438521   31817 main.go:141] libmachine: (ha-333994-m02) define libvirt domain using xml: 
	I0717 17:26:31.438545   31817 main.go:141] libmachine: (ha-333994-m02) <domain type='kvm'>
	I0717 17:26:31.438556   31817 main.go:141] libmachine: (ha-333994-m02)   <name>ha-333994-m02</name>
	I0717 17:26:31.438567   31817 main.go:141] libmachine: (ha-333994-m02)   <memory unit='MiB'>2200</memory>
	I0717 17:26:31.438579   31817 main.go:141] libmachine: (ha-333994-m02)   <vcpu>2</vcpu>
	I0717 17:26:31.438584   31817 main.go:141] libmachine: (ha-333994-m02)   <features>
	I0717 17:26:31.438589   31817 main.go:141] libmachine: (ha-333994-m02)     <acpi/>
	I0717 17:26:31.438593   31817 main.go:141] libmachine: (ha-333994-m02)     <apic/>
	I0717 17:26:31.438600   31817 main.go:141] libmachine: (ha-333994-m02)     <pae/>
	I0717 17:26:31.438604   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.438610   31817 main.go:141] libmachine: (ha-333994-m02)   </features>
	I0717 17:26:31.438614   31817 main.go:141] libmachine: (ha-333994-m02)   <cpu mode='host-passthrough'>
	I0717 17:26:31.438621   31817 main.go:141] libmachine: (ha-333994-m02)   
	I0717 17:26:31.438628   31817 main.go:141] libmachine: (ha-333994-m02)   </cpu>
	I0717 17:26:31.438640   31817 main.go:141] libmachine: (ha-333994-m02)   <os>
	I0717 17:26:31.438654   31817 main.go:141] libmachine: (ha-333994-m02)     <type>hvm</type>
	I0717 17:26:31.438664   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='cdrom'/>
	I0717 17:26:31.438671   31817 main.go:141] libmachine: (ha-333994-m02)     <boot dev='hd'/>
	I0717 17:26:31.438679   31817 main.go:141] libmachine: (ha-333994-m02)     <bootmenu enable='no'/>
	I0717 17:26:31.438683   31817 main.go:141] libmachine: (ha-333994-m02)   </os>
	I0717 17:26:31.438688   31817 main.go:141] libmachine: (ha-333994-m02)   <devices>
	I0717 17:26:31.438696   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='cdrom'>
	I0717 17:26:31.438705   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/boot2docker.iso'/>
	I0717 17:26:31.438717   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hdc' bus='scsi'/>
	I0717 17:26:31.438728   31817 main.go:141] libmachine: (ha-333994-m02)       <readonly/>
	I0717 17:26:31.438741   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438755   31817 main.go:141] libmachine: (ha-333994-m02)     <disk type='file' device='disk'>
	I0717 17:26:31.438807   31817 main.go:141] libmachine: (ha-333994-m02)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0717 17:26:31.438833   31817 main.go:141] libmachine: (ha-333994-m02)       <source file='/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/ha-333994-m02.rawdisk'/>
	I0717 17:26:31.438839   31817 main.go:141] libmachine: (ha-333994-m02)       <target dev='hda' bus='virtio'/>
	I0717 17:26:31.438845   31817 main.go:141] libmachine: (ha-333994-m02)     </disk>
	I0717 17:26:31.438850   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438856   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='mk-ha-333994'/>
	I0717 17:26:31.438860   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438865   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438871   31817 main.go:141] libmachine: (ha-333994-m02)     <interface type='network'>
	I0717 17:26:31.438883   31817 main.go:141] libmachine: (ha-333994-m02)       <source network='default'/>
	I0717 17:26:31.438890   31817 main.go:141] libmachine: (ha-333994-m02)       <model type='virtio'/>
	I0717 17:26:31.438898   31817 main.go:141] libmachine: (ha-333994-m02)     </interface>
	I0717 17:26:31.438911   31817 main.go:141] libmachine: (ha-333994-m02)     <serial type='pty'>
	I0717 17:26:31.438923   31817 main.go:141] libmachine: (ha-333994-m02)       <target port='0'/>
	I0717 17:26:31.438931   31817 main.go:141] libmachine: (ha-333994-m02)     </serial>
	I0717 17:26:31.438942   31817 main.go:141] libmachine: (ha-333994-m02)     <console type='pty'>
	I0717 17:26:31.438953   31817 main.go:141] libmachine: (ha-333994-m02)       <target type='serial' port='0'/>
	I0717 17:26:31.438964   31817 main.go:141] libmachine: (ha-333994-m02)     </console>
	I0717 17:26:31.438974   31817 main.go:141] libmachine: (ha-333994-m02)     <rng model='virtio'>
	I0717 17:26:31.438987   31817 main.go:141] libmachine: (ha-333994-m02)       <backend model='random'>/dev/random</backend>
	I0717 17:26:31.438999   31817 main.go:141] libmachine: (ha-333994-m02)     </rng>
	I0717 17:26:31.439010   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439021   31817 main.go:141] libmachine: (ha-333994-m02)     
	I0717 17:26:31.439030   31817 main.go:141] libmachine: (ha-333994-m02)   </devices>
	I0717 17:26:31.439039   31817 main.go:141] libmachine: (ha-333994-m02) </domain>
	I0717 17:26:31.439049   31817 main.go:141] libmachine: (ha-333994-m02) 
	I0717 17:26:31.445546   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:e9:27:93 in network default
	I0717 17:26:31.446057   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:26:31.446081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:31.446683   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:26:31.446957   31817 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:26:31.447352   31817 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:26:31.447953   31817 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:26:32.668554   31817 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:26:32.669421   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.669837   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.669869   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.669821   32198 retry.go:31] will retry after 265.908605ms: waiting for machine to come up
	I0717 17:26:32.937392   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:32.937818   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:32.937841   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:32.937787   32198 retry.go:31] will retry after 263.816332ms: waiting for machine to come up
	I0717 17:26:33.203484   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.203889   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.203915   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.203865   32198 retry.go:31] will retry after 370.046003ms: waiting for machine to come up
	I0717 17:26:33.575157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:33.575547   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:33.575577   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:33.575470   32198 retry.go:31] will retry after 487.691796ms: waiting for machine to come up
	I0717 17:26:34.065171   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.065647   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.065668   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.065610   32198 retry.go:31] will retry after 737.756145ms: waiting for machine to come up
	I0717 17:26:34.804469   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:34.804805   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:34.804833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:34.804748   32198 retry.go:31] will retry after 716.008929ms: waiting for machine to come up
	I0717 17:26:35.522742   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:35.523151   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:35.523175   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:35.523122   32198 retry.go:31] will retry after 1.039877882s: waiting for machine to come up
	I0717 17:26:36.564784   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:36.565187   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:36.565236   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:36.565168   32198 retry.go:31] will retry after 946.347249ms: waiting for machine to come up
	I0717 17:26:37.513629   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:37.514132   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:37.514159   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:37.514078   32198 retry.go:31] will retry after 1.425543571s: waiting for machine to come up
	I0717 17:26:38.941439   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:38.941914   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:38.941941   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:38.941867   32198 retry.go:31] will retry after 2.252250366s: waiting for machine to come up
	I0717 17:26:41.195297   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:41.195830   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:41.195853   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:41.195783   32198 retry.go:31] will retry after 2.725572397s: waiting for machine to come up
	I0717 17:26:43.922616   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:43.923015   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:43.923039   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:43.922970   32198 retry.go:31] will retry after 3.508475549s: waiting for machine to come up
	I0717 17:26:47.432839   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:47.433277   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:26:47.433306   31817 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:26:47.433245   32198 retry.go:31] will retry after 3.328040591s: waiting for machine to come up
	I0717 17:26:50.765649   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766087   31817 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:26:50.766108   31817 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:26:50.766147   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.766429   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994
	I0717 17:26:50.835843   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:50.835875   31817 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:26:50.835890   31817 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:26:50.838442   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:50.838833   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994
	I0717 17:26:50.838858   31817 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find defined IP address of network mk-ha-333994 interface with MAC address 52:54:00:b1:0f:81
	I0717 17:26:50.839017   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:50.839052   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:50.839081   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:50.839104   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:50.839121   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:50.842964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: exit status 255: 
	I0717 17:26:50.842984   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0717 17:26:50.842995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | command : exit 0
	I0717 17:26:50.843004   31817 main.go:141] libmachine: (ha-333994-m02) DBG | err     : exit status 255
	I0717 17:26:50.843028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | output  : 
	I0717 17:26:53.843162   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:26:53.845524   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.845912   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.845964   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.846160   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:26:53.846190   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:26:53.846218   31817 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:26:53.846237   31817 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:26:53.846249   31817 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:26:53.977891   31817 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:26:53.978192   31817 main.go:141] libmachine: (ha-333994-m02) KVM machine creation complete!
	I0717 17:26:53.978493   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:53.979005   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979196   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:53.979349   31817 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0717 17:26:53.979361   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:26:53.980446   31817 main.go:141] libmachine: Detecting operating system of created instance...
	I0717 17:26:53.980458   31817 main.go:141] libmachine: Waiting for SSH to be available...
	I0717 17:26:53.980463   31817 main.go:141] libmachine: Getting to WaitForSSH function...
	I0717 17:26:53.980469   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:53.982666   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983028   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:53.983061   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:53.983193   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:53.983351   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983482   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:53.983592   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:53.983736   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:53.983941   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:53.983953   31817 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0717 17:26:54.097606   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:26:54.097631   31817 main.go:141] libmachine: Detecting the provisioner...
	I0717 17:26:54.097638   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.100274   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.100626   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.100772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.100954   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101115   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.101230   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.101387   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.101557   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.101569   31817 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0717 17:26:54.214758   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0717 17:26:54.214823   31817 main.go:141] libmachine: found compatible host: buildroot
	I0717 17:26:54.214832   31817 main.go:141] libmachine: Provisioning with buildroot...
	I0717 17:26:54.214839   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215071   31817 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:26:54.215095   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.215281   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.217709   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218130   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.218157   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.218274   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.218456   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218598   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.218743   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.218879   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.219074   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.219087   31817 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:26:54.348717   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:26:54.348783   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.351584   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.351923   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.351944   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.352126   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.352288   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352474   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.352599   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.352725   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:54.352881   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:54.352895   31817 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:26:54.476331   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 
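The shell command above keeps `/etc/hosts` consistent with the new hostname: do nothing if some line already ends with the name, otherwise rewrite an existing `127.0.1.1` line in place, otherwise append one. The same decision tree as a Go sketch over the file contents — `ensureHostsEntry` is an illustrative name, not minikube code:

```go
package main

import (
	"fmt"
	"strings"
)

// ensureHostsEntry returns hosts-file content guaranteed to map 127.0.1.1
// to name, mirroring the grep/sed/tee branches in the SSH command above.
func ensureHostsEntry(hosts, name string) string {
	lines := strings.Split(hosts, "\n")
	for _, l := range lines {
		f := strings.Fields(l)
		if len(f) >= 2 && f[len(f)-1] == name {
			return hosts // the outer grep: some line already ends with the name
		}
	}
	for i, l := range lines {
		if strings.HasPrefix(l, "127.0.1.1") {
			lines[i] = "127.0.1.1 " + name // the sed branch: rewrite in place
			return strings.Join(lines, "\n")
		}
	}
	return hosts + "\n127.0.1.1 " + name // the tee -a branch: append
}

func main() {
	fmt.Println(ensureHostsEntry("127.0.0.1 localhost", "ha-333994-m02"))
}
```

Mapping the hostname to `127.0.1.1` (rather than `127.0.0.1`) keeps `hostname -f` resolvable without colliding with the loopback entry, which is why the command targets that address specifically.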
	I0717 17:26:54.476371   31817 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:26:54.476397   31817 buildroot.go:174] setting up certificates
	I0717 17:26:54.476416   31817 provision.go:84] configureAuth start
	I0717 17:26:54.476438   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:26:54.476719   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:54.479208   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479564   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.479592   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.479788   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.481800   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482086   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.482109   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.482263   31817 provision.go:143] copyHostCerts
	I0717 17:26:54.482290   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482319   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:26:54.482328   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:26:54.482388   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:26:54.482455   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482472   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:26:54.482478   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:26:54.482502   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:26:54.482542   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482558   31817 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:26:54.482564   31817 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:26:54.482584   31817 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:26:54.482627   31817 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:26:54.697157   31817 provision.go:177] copyRemoteCerts
	I0717 17:26:54.697210   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:26:54.697233   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.699959   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700263   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.700281   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.700480   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.700699   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.700860   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.701000   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.792678   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:26:54.792760   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:26:54.816985   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:26:54.817058   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:26:54.841268   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:26:54.841343   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:26:54.865093   31817 provision.go:87] duration metric: took 388.663223ms to configureAuth
	I0717 17:26:54.865120   31817 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:26:54.865311   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:54.865337   31817 main.go:141] libmachine: Checking connection to Docker...
	I0717 17:26:54.865347   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetURL
	I0717 17:26:54.866495   31817 main.go:141] libmachine: (ha-333994-m02) DBG | Using libvirt version 6000000
	I0717 17:26:54.868417   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868765   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.868792   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.868933   31817 main.go:141] libmachine: Docker is up and running!
	I0717 17:26:54.868949   31817 main.go:141] libmachine: Reticulating splines...
	I0717 17:26:54.868955   31817 client.go:171] duration metric: took 23.822273283s to LocalClient.Create
	I0717 17:26:54.868974   31817 start.go:167] duration metric: took 23.822329608s to libmachine.API.Create "ha-333994"
	I0717 17:26:54.868982   31817 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:26:54.868990   31817 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:26:54.869011   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:54.869243   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:26:54.869264   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:54.871450   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.871816   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:54.871840   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:54.872022   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:54.872180   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:54.872326   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:54.872476   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:54.961235   31817 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:26:54.965604   31817 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:26:54.965626   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:26:54.965684   31817 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:26:54.965757   31817 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:26:54.965766   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:26:54.965847   31817 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:26:54.975595   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:54.999236   31817 start.go:296] duration metric: took 130.241349ms for postStartSetup
	I0717 17:26:54.999289   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:26:54.999814   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.002512   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.002864   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.002901   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.003161   31817 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:26:55.003366   31817 start.go:128] duration metric: took 23.974275382s to createHost
	I0717 17:26:55.003388   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.005328   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005632   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.005656   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.005830   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.006002   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006161   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.006292   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.006451   31817 main.go:141] libmachine: Using SSH client type: native
	I0717 17:26:55.006637   31817 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:26:55.006649   31817 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:26:55.122903   31817 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721237215.098211807
	
	I0717 17:26:55.122928   31817 fix.go:216] guest clock: 1721237215.098211807
	I0717 17:26:55.122937   31817 fix.go:229] Guest: 2024-07-17 17:26:55.098211807 +0000 UTC Remote: 2024-07-17 17:26:55.003376883 +0000 UTC m=+77.663313056 (delta=94.834924ms)
	I0717 17:26:55.122956   31817 fix.go:200] guest clock delta is within tolerance: 94.834924ms
	I0717 17:26:55.122962   31817 start.go:83] releasing machines lock for "ha-333994-m02", held for 24.094009758s
	I0717 17:26:55.122986   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.123244   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:55.125631   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.125927   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.125955   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.128661   31817 out.go:177] * Found network options:
	I0717 17:26:55.130349   31817 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:26:55.131717   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.131742   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132304   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132476   31817 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:26:55.132554   31817 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:26:55.132594   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:26:55.132666   31817 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:26:55.132744   31817 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:26:55.132772   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:26:55.135185   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135477   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135501   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135519   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135642   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.135817   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.135976   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:55.135995   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:55.135977   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136127   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:26:55.136190   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:26:55.136268   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:26:55.136402   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:26:55.136527   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:26:55.220815   31817 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:26:55.220875   31817 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:26:55.245507   31817 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:26:55.245531   31817 start.go:495] detecting cgroup driver to use...
	I0717 17:26:55.245596   31817 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:26:55.278918   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:26:55.292940   31817 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:26:55.293020   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:26:55.306646   31817 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:26:55.321727   31817 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:26:55.453026   31817 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:26:55.618252   31817 docker.go:233] disabling docker service ...
	I0717 17:26:55.618323   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:26:55.633535   31817 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:26:55.647399   31817 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:26:55.767544   31817 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:26:55.888191   31817 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:26:55.901625   31817 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:26:55.919869   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:26:55.930472   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:26:55.940635   31817 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:26:55.940681   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:26:55.950966   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.961459   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:26:55.972051   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:26:55.983017   31817 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:26:55.993746   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:26:56.004081   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:26:56.014291   31817 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:26:56.024660   31817 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:26:56.033932   31817 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:26:56.033978   31817 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:26:56.047409   31817 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:26:56.057123   31817 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:26:56.196097   31817 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:26:56.227087   31817 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:26:56.227147   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:56.232659   31817 retry.go:31] will retry after 933.236719ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:26:57.166776   31817 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:26:57.172003   31817 start.go:563] Will wait 60s for crictl version
	I0717 17:26:57.172071   31817 ssh_runner.go:195] Run: which crictl
	I0717 17:26:57.176036   31817 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:26:57.214182   31817 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:26:57.214259   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.239883   31817 ssh_runner.go:195] Run: containerd --version
	I0717 17:26:57.270199   31817 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:26:57.271461   31817 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:26:57.272522   31817 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:26:57.274799   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275154   31817 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:26:45 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:26:57.275183   31817 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:26:57.275351   31817 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:26:57.279650   31817 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:26:57.293824   31817 mustload.go:65] Loading cluster: ha-333994
	I0717 17:26:57.294006   31817 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:26:57.294269   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.294293   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.308598   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36705
	I0717 17:26:57.309000   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.309480   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.309502   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.309752   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.309903   31817 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:26:57.311534   31817 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:26:57.311828   31817 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:26:57.311870   31817 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:26:57.326228   31817 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32825
	I0717 17:26:57.326552   31817 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:26:57.327001   31817 main.go:141] libmachine: Using API Version  1
	I0717 17:26:57.327022   31817 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:26:57.327287   31817 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:26:57.327462   31817 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:26:57.327619   31817 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:26:57.327627   31817 certs.go:194] generating shared ca certs ...
	I0717 17:26:57.327639   31817 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.327753   31817 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:26:57.327802   31817 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:26:57.327812   31817 certs.go:256] generating profile certs ...
	I0717 17:26:57.327877   31817 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:26:57.327900   31817 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:26:57.327913   31817 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:26:57.458239   31817 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff ...
	I0717 17:26:57.458268   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff: {Name:mke87290a04a64b5c9a3f70eca7bbd7f3ab62e57 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458428   31817 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff ...
	I0717 17:26:57.458440   31817 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff: {Name:mkcd9a6c319770e7232a22dd759a83106e261b10 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:26:57.458506   31817 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:26:57.458644   31817 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:26:57.458768   31817 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:26:57.458782   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:26:57.458794   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:26:57.458806   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:26:57.458818   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:26:57.458830   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:26:57.458841   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:26:57.458852   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:26:57.458865   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:26:57.458910   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:26:57.458936   31817 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:26:57.458945   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:26:57.458966   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:26:57.458986   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:26:57.459013   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:26:57.459048   31817 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:26:57.459071   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:26:57.459084   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:57.459095   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:26:57.459124   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:26:57.461994   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462403   31817 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:26:57.462430   31817 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:26:57.462587   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:26:57.462744   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:26:57.462905   31817 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:26:57.462996   31817 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:26:57.538412   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:26:57.543898   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:26:57.556474   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:26:57.560660   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:26:57.570923   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:26:57.574879   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:26:57.585092   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:26:57.589304   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:26:57.599639   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:26:57.603878   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:26:57.616227   31817 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:26:57.620350   31817 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:26:57.632125   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:26:57.657494   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:26:57.682754   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:26:57.707851   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:26:57.731860   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1436 bytes)
	I0717 17:26:57.757707   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:26:57.781205   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:26:57.804275   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:26:57.829670   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:26:57.855063   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:26:57.881215   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:26:57.906393   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:26:57.924441   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:26:57.942446   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:26:57.958731   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:26:57.974971   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:26:57.991007   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:26:58.006856   31817 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:26:58.023616   31817 ssh_runner.go:195] Run: openssl version
	I0717 17:26:58.029309   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:26:58.040022   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044627   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.044684   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:26:58.050556   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:26:58.060921   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:26:58.071585   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075832   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.075882   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:26:58.081281   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:26:58.091769   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:26:58.102180   31817 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106524   31817 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.106575   31817 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:26:58.112063   31817 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:26:58.122675   31817 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:26:58.126524   31817 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:26:58.126576   31817 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:26:58.126678   31817 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:26:58.126707   31817 kube-vip.go:115] generating kube-vip config ...
	I0717 17:26:58.126735   31817 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:26:58.143233   31817 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:26:58.143291   31817 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:26:58.143334   31817 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.153157   31817 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.30.2: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.30.2': No such file or directory
	
	Initiating transfer...
	I0717 17:26:58.153211   31817 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.30.2
	I0717 17:26:58.162734   31817 binary.go:74] Not caching binary, using https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256
	I0717 17:26:58.162759   31817 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl -> /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162833   31817 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl
	I0717 17:26:58.162840   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet
	I0717 17:26:58.162877   31817 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubeadm.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubeadm
	I0717 17:26:58.167096   31817 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.30.2/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.30.2/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.30.2/kubectl': No such file or directory
	I0717 17:26:58.167122   31817 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl --> /var/lib/minikube/binaries/v1.30.2/kubectl (51454104 bytes)
	I0717 17:27:14.120624   31817 out.go:177] 
	W0717 17:27:14.122586   31817 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: update node: downloading binaries: downloading kubelet: download failed: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256: getter: &{Ctx:context.Background Src:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubelet.sha256 Dst:/home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubelet.download Pwd: Mode:2 Umask:---------- Detectors:[0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920 0x49ca920] Decompressors:map[bz2:0xc000883490 gz:0xc000883498 tar:0xc000883440 tar.bz2:0xc000883450 tar.gz:0xc000883460 tar.xz:0xc000883470 tar.zst:0xc000883480 tbz2:0xc000883450 tgz:0xc000883460 txz:0xc000883470 tzst:0xc000883480 xz:0xc0008834a0 zip:0xc0008834b0 zst:0xc0008834a8] Getters:map[file:0xc000691350 http:0xc0009febe0 https:0xc0009fec30] Dir:false ProgressListener:<nil> Insecure:false DisableSymlinks:false Options:[]}: read tcp 10.194.0.2:36556->151.101.193.55:443: read: connection reset by peer
	W0717 17:27:14.122605   31817 out.go:239] * 
	W0717 17:27:14.123461   31817 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:27:14.125013   31817 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	db107babf5b82       8c811b4aec35f       18 minutes ago      Running             busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	86b483ab22e1a       6e38f40d628db       19 minutes ago      Running             storage-provisioner       0                   4ae1e67fc3bab       storage-provisioner
	dcb6f2bdfe23d       cbb01a7bd410d       19 minutes ago      Running             coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       19 minutes ago      Running             coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       19 minutes ago      Running             kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       19 minutes ago      Running             kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	2030e6caab488       38af8ddebf499       19 minutes ago      Running             kube-vip                  0                   08971202a22cc       kube-vip-ha-333994
	d3a0374a88e2c       56ce0fd9fb532       19 minutes ago      Running             kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       19 minutes ago      Running             kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       19 minutes ago      Running             etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       19 minutes ago      Running             kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.272818878Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:0,}"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.281551441Z" level=info msg="CreateContainer within sandbox \"3e096287e39aa2659fbac6271df8b9e49c2f98bff34a88e616d0f4d213890d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.282808085Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.306661258Z" level=info msg="CreateContainer within sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" for &ContainerMetadata{Name:storage-provisioner,Attempt:0,} returns container id \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.308244470Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.405145943Z" level=info msg="StartContainer for \"5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.416098689Z" level=info msg="StartContainer for \"dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f\" returns successfully"
	Jul 17 17:26:47 ha-333994 containerd[645]: time="2024-07-17T17:26:47.459142473Z" level=info msg="StartContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.515431127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,}"
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.605927672Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606184419Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606197437Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.606895269Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.700176521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:busybox-fc5497c4f-5ngfp,Uid:5b8ac45d-057c-4c2f-9ac8-005cd6470ff6,Namespace:default,Attempt:0,} returns sandbox id \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\""
	Jul 17 17:27:16 ha-333994 containerd[645]: time="2024-07-17T17:27:16.704494262Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.067071710Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox:1.28\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.069080528Z" level=info msg="stop pulling image gcr.io/k8s-minikube/busybox:1.28: active requests=0, bytes read=725937"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.071667235Z" level=info msg="ImageCreate event name:\"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.075629687Z" level=info msg="ImageCreate event name:\"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076342636Z" level=info msg="Pulled image \"gcr.io/k8s-minikube/busybox:1.28\" with image id \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\", repo tag \"gcr.io/k8s-minikube/busybox:1.28\", repo digest \"gcr.io/k8s-minikube/busybox@sha256:9afb80db71730dbb303fe00765cbf34bddbdc6b66e49897fc2e1861967584b12\", size \"725911\" in 2.371740637s"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.076392577Z" level=info msg="PullImage \"gcr.io/k8s-minikube/busybox:1.28\" returns image reference \"sha256:8c811b4aec35f259572d0f79207bc0678df4c736eeec50bc9fec37ed936a472a\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.081681382Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for container &ContainerMetadata{Name:busybox,Attempt:0,}"
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.112976990Z" level=info msg="CreateContainer within sandbox \"d9ed5134ca786a315dca1fe3c6539b34e78357fb73fa044c29c355bc761cfea4\" for &ContainerMetadata{Name:busybox,Attempt:0,} returns container id \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.114037685Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\""
	Jul 17 17:27:19 ha-333994 containerd[645]: time="2024-07-17T17:27:19.181248193Z" level=info msg="StartContainer for \"db107babf5b82c0155b5870fee0f6a9b29a3ff7c5baf85111b044cf8475b54ed\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:45:58 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:43:08 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:43:08 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:43:08 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:43:08 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    b53aa9e9-08a4-4435-bef0-7135f94a954e
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     19m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     19m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         19m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      19m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         19m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 19m                kube-proxy       
	  Normal  Starting                 19m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  19m (x4 over 19m)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    19m (x4 over 19m)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     19m (x3 over 19m)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  19m                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 19m                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  19m                kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    19m                kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     19m                kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  19m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           19m                node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                19m                kubelet          Node ha-333994 status is now: NodeReady
	
	
	Name:               ha-333994-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_17T17_40_16_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:40:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:46:02 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:40:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:40:34 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.197
	  Hostname:    ha-333994-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 91a213a8eb09434f90fc54c32c57b24f
	  System UUID:                91a213a8-eb09-434f-90fc-54c32c57b24f
	  Boot ID:                    45ccee74-7f48-47d9-9195-b6f993074cc5
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-74lsp    0 (0%)        0 (0%)      0 (0%)           0 (0%)         18m
	  kube-system                 kindnet-24fc8              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      5m50s
	  kube-system                 kube-proxy-xkkdj           0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m50s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 5m45s                  kube-proxy       
	  Normal  NodeHasSufficientMemory  5m50s (x2 over 5m50s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    5m50s (x2 over 5m50s)  kubelet          Node ha-333994-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m50s (x2 over 5m50s)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  5m50s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           5m46s                  node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	  Normal  NodeReady                5m31s                  kubelet          Node ha-333994-m03 status is now: NodeReady
	
	
	==> dmesg <==
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050377] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040128] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.544620] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.311602] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +4.612117] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +5.994239] systemd-fstab-generator[509]: Ignoring "noauto" option for root device
	[  +0.059342] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.054424] systemd-fstab-generator[521]: Ignoring "noauto" option for root device
	[  +0.171527] systemd-fstab-generator[535]: Ignoring "noauto" option for root device
	[  +0.142059] systemd-fstab-generator[547]: Ignoring "noauto" option for root device
	[  +0.293838] systemd-fstab-generator[578]: Ignoring "noauto" option for root device
	[Jul17 17:26] systemd-fstab-generator[637]: Ignoring "noauto" option for root device
	[  +0.060652] kauditd_printk_skb: 158 callbacks suppressed
	[  +0.475443] systemd-fstab-generator[688]: Ignoring "noauto" option for root device
	[  +3.877515] systemd-fstab-generator[863]: Ignoring "noauto" option for root device
	[  +1.168977] kauditd_printk_skb: 85 callbacks suppressed
	[  +5.141999] kauditd_printk_skb: 35 callbacks suppressed
	[  +0.960648] systemd-fstab-generator[1314]: Ignoring "noauto" option for root device
	[  +5.705099] kauditd_printk_skb: 23 callbacks suppressed
	[ +13.765378] kauditd_printk_skb: 29 callbacks suppressed
	[Jul17 17:27] kauditd_printk_skb: 26 callbacks suppressed
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.796264Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79633Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.79643Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:41:11.077099Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":1506}
	{"level":"info","ts":"2024-07-17T17:41:11.08271Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":1506,"took":"4.803656ms","hash":4135639207,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2002944,"current-db-size-in-use":"2.0 MB"}
	{"level":"info","ts":"2024-07-17T17:41:11.082934Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":4135639207,"revision":1506,"compact-revision":967}
	
	
	==> kernel <==
	 17:46:05 up 20 min,  0 users,  load average: 0.28, 0.26, 0.19
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:44:56.593925       1 main.go:303] handling current node
	I0717 17:45:06.601903       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:06.602313       1 main.go:303] handling current node
	I0717 17:45:06.602450       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:06.602539       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:16.599330       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:16.599416       1 main.go:303] handling current node
	I0717 17:45:16.599444       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:16.599450       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:26.593296       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:26.593343       1 main.go:303] handling current node
	I0717 17:45:26.593373       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:26.593378       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:36.593155       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:36.593308       1 main.go:303] handling current node
	I0717 17:45:36.593330       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:36.593336       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:46.602244       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:46.602295       1 main.go:303] handling current node
	I0717 17:45:46.602314       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:46.602320       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:45:56.593212       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:45:56.593261       1 main.go:303] handling current node
	I0717 17:45:56.593285       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:45:56.593291       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.690107       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="56.918µs"
	I0717 17:26:46.708437       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="58.561µs"
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:41:16 ha-333994 kubelet[1321]: E0717 17:41:16.469006    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:41:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:41:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:41:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:41:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:42:16 ha-333994 kubelet[1321]: E0717 17:42:16.469497    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:42:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:42:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:42:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:42:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:43:16 ha-333994 kubelet[1321]: E0717 17:43:16.470172    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:43:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:43:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:43:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:43:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:44:16 ha-333994 kubelet[1321]: E0717 17:44:16.472787    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:44:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:44:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:44:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:44:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:45:16 ha-333994 kubelet[1321]: E0717 17:45:16.469762    1321 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:45:16 ha-333994 kubelet[1321]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:45:16 ha-333994 kubelet[1321]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:45:16 ha-333994 kubelet[1321]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:45:16 ha-333994 kubelet[1321]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  8m50s (x3 over 18m)  default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  20s (x3 over 5m32s)  default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/HAppyAfterSecondaryNodeRestart (2.22s)

                                                
                                    
TestMultiControlPlane/serial/RestartClusterKeepsNodes (473.38s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:456: (dbg) Run:  out/minikube-linux-amd64 node list -p ha-333994 -v=7 --alsologtostderr
ha_test.go:462: (dbg) Run:  out/minikube-linux-amd64 stop -p ha-333994 -v=7 --alsologtostderr
E0717 17:47:52.134373   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
ha_test.go:462: (dbg) Done: out/minikube-linux-amd64 stop -p ha-333994 -v=7 --alsologtostderr: (3m5.015404013s)
ha_test.go:467: (dbg) Run:  out/minikube-linux-amd64 start -p ha-333994 --wait=true -v=7 --alsologtostderr
E0717 17:49:41.797225   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 17:52:52.134535   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
ha_test.go:467: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p ha-333994 --wait=true -v=7 --alsologtostderr: exit status 80 (4m45.39361339s)

                                                
                                                
-- stdout --
	* [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=19283
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the kvm2 driver based on existing profile
	* Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	* Restarting existing kvm2 VM for "ha-333994" ...
	* Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	* Enabled addons: 
	
	* Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	* Restarting existing kvm2 VM for "ha-333994-m02" ...
	* Found network options:
	  - NO_PROXY=192.168.39.180
	* Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	  - env NO_PROXY=192.168.39.180
	* Verifying Kubernetes components...
	
	

-- /stdout --
** stderr ** 
	I0717 17:49:11.274843   39794 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:49:11.274995   39794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:49:11.275005   39794 out.go:304] Setting ErrFile to fd 2...
	I0717 17:49:11.275011   39794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:49:11.275192   39794 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:49:11.275748   39794 out.go:298] Setting JSON to false
	I0717 17:49:11.276624   39794 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":5494,"bootTime":1721233057,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:49:11.276685   39794 start.go:139] virtualization: kvm guest
	I0717 17:49:11.279428   39794 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:49:11.280920   39794 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:49:11.280939   39794 notify.go:220] Checking for updates...
	I0717 17:49:11.284081   39794 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:49:11.285572   39794 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:11.286973   39794 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:49:11.288259   39794 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:49:11.289617   39794 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:49:11.291360   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:11.291471   39794 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:49:11.291860   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:11.291910   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:11.306389   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41441
	I0717 17:49:11.306830   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:11.307340   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:11.307365   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:11.307652   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:11.307877   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:11.342518   39794 out.go:177] * Using the kvm2 driver based on existing profile
	I0717 17:49:11.343905   39794 start.go:297] selected driver: kvm2
	I0717 17:49:11.343922   39794 start.go:901] validating driver "kvm2" against &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:11.344074   39794 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:49:11.344385   39794 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:49:11.344460   39794 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:49:11.359473   39794 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:49:11.360126   39794 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:49:11.360191   39794 cni.go:84] Creating CNI manager for ""
	I0717 17:49:11.360203   39794 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0717 17:49:11.360258   39794 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:11.360356   39794 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:49:11.362215   39794 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:49:11.363497   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:11.363528   39794 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:49:11.363538   39794 cache.go:56] Caching tarball of preloaded images
	I0717 17:49:11.363621   39794 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:49:11.363633   39794 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:49:11.363751   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:11.363927   39794 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:49:11.363968   39794 start.go:364] duration metric: took 23.038µs to acquireMachinesLock for "ha-333994"
	I0717 17:49:11.363985   39794 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:49:11.363995   39794 fix.go:54] fixHost starting: 
	I0717 17:49:11.364238   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:11.364269   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:11.378515   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45003
	I0717 17:49:11.378994   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:11.379458   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:11.379478   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:11.379772   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:11.379977   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:11.380153   39794 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:49:11.381889   39794 fix.go:112] recreateIfNeeded on ha-333994: state=Stopped err=<nil>
	I0717 17:49:11.381920   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	W0717 17:49:11.382061   39794 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:49:11.384353   39794 out.go:177] * Restarting existing kvm2 VM for "ha-333994" ...
	I0717 17:49:11.386332   39794 main.go:141] libmachine: (ha-333994) Calling .Start
	I0717 17:49:11.386525   39794 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:49:11.387295   39794 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:49:11.387605   39794 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:49:11.387902   39794 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:49:11.388700   39794 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:49:12.581316   39794 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:49:12.582199   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:12.582613   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:12.582685   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:12.582591   39823 retry.go:31] will retry after 292.960023ms: waiting for machine to come up
	I0717 17:49:12.877268   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:12.877833   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:12.877861   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:12.877756   39823 retry.go:31] will retry after 283.500887ms: waiting for machine to come up
	I0717 17:49:13.163417   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:13.163805   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:13.163826   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:13.163761   39823 retry.go:31] will retry after 385.368306ms: waiting for machine to come up
	I0717 17:49:13.550406   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:13.550840   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:13.550897   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:13.550822   39823 retry.go:31] will retry after 528.571293ms: waiting for machine to come up
	I0717 17:49:14.080602   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:14.081093   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:14.081118   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:14.081048   39823 retry.go:31] will retry after 736.772802ms: waiting for machine to come up
	I0717 17:49:14.818924   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:14.819326   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:14.819347   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:14.819281   39823 retry.go:31] will retry after 776.986347ms: waiting for machine to come up
	I0717 17:49:15.598237   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:15.598607   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:15.598627   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:15.598573   39823 retry.go:31] will retry after 1.036578969s: waiting for machine to come up
	I0717 17:49:16.637046   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:16.637440   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:16.637463   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:16.637404   39823 retry.go:31] will retry after 1.055320187s: waiting for machine to come up
	I0717 17:49:17.694838   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:17.695248   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:17.695273   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:17.695211   39823 retry.go:31] will retry after 1.335817707s: waiting for machine to come up
	I0717 17:49:19.032835   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:19.033306   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:19.033330   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:19.033266   39823 retry.go:31] will retry after 1.730964136s: waiting for machine to come up
	I0717 17:49:20.766254   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:20.766740   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:20.766768   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:20.766694   39823 retry.go:31] will retry after 2.796619276s: waiting for machine to come up
	I0717 17:49:23.566195   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:23.566759   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:23.566784   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:23.566716   39823 retry.go:31] will retry after 3.008483388s: waiting for machine to come up
	I0717 17:49:26.576866   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:26.577295   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:26.577318   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:26.577242   39823 retry.go:31] will retry after 2.889284576s: waiting for machine to come up
	I0717 17:49:29.467942   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.468316   39794 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:49:29.468337   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.468346   39794 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:49:29.468737   39794 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:49:29.468757   39794 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:49:29.468777   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.468804   39794 main.go:141] libmachine: (ha-333994) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"}
	I0717 17:49:29.468820   39794 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:49:29.470695   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.471026   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.471058   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.471199   39794 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:49:29.471226   39794 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:49:29.471255   39794 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:49:29.471268   39794 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:49:29.471282   39794 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:49:29.598374   39794 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:49:29.598754   39794 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:49:29.599414   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:29.601913   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.602312   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.602351   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.602634   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:29.602858   39794 machine.go:94] provisionDockerMachine start ...
	I0717 17:49:29.602888   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:29.603106   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.605092   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.605423   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.605446   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.605613   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.605754   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.605900   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.606023   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.606203   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.606385   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.606396   39794 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:49:29.714755   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:49:29.714801   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.715040   39794 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:49:29.715065   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.715237   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.717642   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.717930   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.717959   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.718110   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.718285   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.718413   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.718528   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.718679   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.718838   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.718848   39794 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:49:29.840069   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:49:29.840100   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.842822   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.843208   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.843233   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.843392   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.843581   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.843706   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.843878   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.844054   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.844256   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.844272   39794 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:49:29.959423   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:49:29.959450   39794 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:49:29.959474   39794 buildroot.go:174] setting up certificates
	I0717 17:49:29.959488   39794 provision.go:84] configureAuth start
	I0717 17:49:29.959495   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.959790   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:29.962162   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.962537   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.962563   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.962700   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.964777   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.965084   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.965116   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.965226   39794 provision.go:143] copyHostCerts
	I0717 17:49:29.965266   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:29.965305   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:49:29.965317   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:29.965397   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:49:29.965507   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:29.965534   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:49:29.965544   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:29.965581   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:49:29.965639   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:29.965671   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:49:29.965680   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:29.965714   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:49:29.965774   39794 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:49:30.057325   39794 provision.go:177] copyRemoteCerts
	I0717 17:49:30.057377   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:49:30.057400   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.059825   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.060114   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.060140   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.060281   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.060451   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.060561   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.060675   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.146227   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:49:30.146289   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:49:30.174390   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:49:30.174450   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:49:30.202477   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:49:30.202541   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0717 17:49:30.229907   39794 provision.go:87] duration metric: took 270.408982ms to configureAuth
	I0717 17:49:30.229929   39794 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:49:30.230164   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:30.230177   39794 machine.go:97] duration metric: took 627.307249ms to provisionDockerMachine
	I0717 17:49:30.230186   39794 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:49:30.230200   39794 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:49:30.230227   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.230520   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:49:30.230554   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.233026   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.233363   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.233390   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.233521   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.233700   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.233828   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.233952   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.318669   39794 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:49:30.323112   39794 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:49:30.323131   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:49:30.323180   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:49:30.323246   39794 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:49:30.323258   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:49:30.323348   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:49:30.334564   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:30.360407   39794 start.go:296] duration metric: took 130.206138ms for postStartSetup
	I0717 17:49:30.360441   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.360727   39794 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:49:30.360774   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.362968   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.363308   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.363334   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.363435   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.363609   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.363749   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.363862   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.448825   39794 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:49:30.448901   39794 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:49:30.490930   39794 fix.go:56] duration metric: took 19.126931057s for fixHost
	I0717 17:49:30.490966   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.493716   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.494056   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.494081   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.494261   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.494473   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.494636   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.494816   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.495007   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:30.495221   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:30.495236   39794 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:49:30.611220   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238570.579395854
	
	I0717 17:49:30.611243   39794 fix.go:216] guest clock: 1721238570.579395854
	I0717 17:49:30.611255   39794 fix.go:229] Guest: 2024-07-17 17:49:30.579395854 +0000 UTC Remote: 2024-07-17 17:49:30.49095133 +0000 UTC m=+19.250883626 (delta=88.444524ms)
	I0717 17:49:30.611271   39794 fix.go:200] guest clock delta is within tolerance: 88.444524ms
	I0717 17:49:30.611277   39794 start.go:83] releasing machines lock for "ha-333994", held for 19.24729888s
	I0717 17:49:30.611293   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.611569   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:30.613990   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.614318   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.614355   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.614483   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.614909   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.615067   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.615169   39794 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:49:30.615215   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.615255   39794 ssh_runner.go:195] Run: cat /version.json
	I0717 17:49:30.615275   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.617353   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617676   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.617702   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617734   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617863   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.618049   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.618146   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.618173   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.618217   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.618306   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.618370   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.618445   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.618555   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.618672   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.694919   39794 ssh_runner.go:195] Run: systemctl --version
	I0717 17:49:30.721823   39794 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:49:30.727892   39794 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:49:30.727967   39794 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:49:30.745249   39794 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:49:30.745272   39794 start.go:495] detecting cgroup driver to use...
	I0717 17:49:30.745332   39794 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:49:30.784101   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:49:30.798192   39794 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:49:30.798265   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:49:30.811458   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:49:30.824815   39794 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:49:30.938731   39794 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:49:31.081893   39794 docker.go:233] disabling docker service ...
	I0717 17:49:31.081980   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:49:31.097028   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:49:31.110328   39794 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:49:31.242915   39794 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:49:31.365050   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:49:31.379135   39794 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:49:31.400136   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:49:31.412561   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:49:31.425082   39794 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:49:31.425159   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:49:31.437830   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:31.450453   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:49:31.462175   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:31.473289   39794 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:49:31.484541   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:49:31.495502   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:49:31.506265   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:49:31.518840   39794 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:49:31.530158   39794 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:49:31.530208   39794 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:49:31.548502   39794 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:49:31.563431   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:31.674043   39794 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:49:31.701907   39794 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:49:31.702006   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:31.706668   39794 retry.go:31] will retry after 920.793788ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:49:32.627794   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:32.632953   39794 start.go:563] Will wait 60s for crictl version
	I0717 17:49:32.633009   39794 ssh_runner.go:195] Run: which crictl
	I0717 17:49:32.636846   39794 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:49:32.677947   39794 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:49:32.678013   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:32.709490   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:32.738106   39794 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:49:32.739529   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:32.742040   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:32.742375   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:32.742405   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:32.742590   39794 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:49:32.746706   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:32.759433   39794 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:49:32.759609   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:32.759661   39794 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:49:32.792410   39794 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:49:32.792432   39794 containerd.go:534] Images already preloaded, skipping extraction
	I0717 17:49:32.792483   39794 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:49:32.824536   39794 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:49:32.824558   39794 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:49:32.824565   39794 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:49:32.824675   39794 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:49:32.824722   39794 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:49:32.856864   39794 cni.go:84] Creating CNI manager for ""
	I0717 17:49:32.856886   39794 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0717 17:49:32.856893   39794 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:49:32.856917   39794 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:49:32.857032   39794 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:49:32.857054   39794 kube-vip.go:115] generating kube-vip config ...
	I0717 17:49:32.857090   39794 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:49:32.875326   39794 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:49:32.875456   39794 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:49:32.875511   39794 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:49:32.885386   39794 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:49:32.885459   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:49:32.895011   39794 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:49:32.913107   39794 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:49:32.929923   39794 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:49:32.946336   39794 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:49:32.962757   39794 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:49:32.966796   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:32.979550   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:33.092357   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:33.111897   39794 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:49:33.111921   39794 certs.go:194] generating shared ca certs ...
	I0717 17:49:33.111940   39794 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.112113   39794 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:49:33.112206   39794 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:49:33.112225   39794 certs.go:256] generating profile certs ...
	I0717 17:49:33.112347   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:49:33.112383   39794 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1
	I0717 17:49:33.112401   39794 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:49:33.337392   39794 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 ...
	I0717 17:49:33.337432   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1: {Name:mkfeb2a5adc7d732ca48854394be4077f3b9b81e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.337612   39794 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1 ...
	I0717 17:49:33.337630   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1: {Name:mk17811291d2c587100f8fbd5f0c9c2d641ddf76 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.337728   39794 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:49:33.337924   39794 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1 -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:49:33.338098   39794 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:49:33.338134   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:49:33.338154   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:49:33.338172   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:49:33.338188   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:49:33.338203   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:49:33.338221   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:49:33.338239   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:49:33.338253   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:49:33.338313   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:49:33.338354   39794 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:49:33.338363   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:49:33.338391   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:49:33.338431   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:49:33.338457   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:49:33.338511   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:33.338549   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.338570   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.338587   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.339107   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:49:33.371116   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:49:33.405873   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:49:33.442007   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:49:33.472442   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0717 17:49:33.496116   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:49:33.527403   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:49:33.552684   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:49:33.576430   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:49:33.599936   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:49:33.623341   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:49:33.646635   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:49:33.663325   39794 ssh_runner.go:195] Run: openssl version
	I0717 17:49:33.668872   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:49:33.679471   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.683810   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.683866   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.689677   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:49:33.700471   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:49:33.710911   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.715522   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.715581   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.721331   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:49:33.731730   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:49:33.742074   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.746374   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.746417   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.751941   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:49:33.762070   39794 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:49:33.766344   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0717 17:49:33.771976   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0717 17:49:33.777506   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0717 17:49:33.783203   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0717 17:49:33.788713   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0717 17:49:33.794346   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0717 17:49:33.800031   39794 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:33.800131   39794 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:49:33.800172   39794 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:49:33.836926   39794 cri.go:89] found id: "86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21"
	I0717 17:49:33.836947   39794 cri.go:89] found id: "dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f"
	I0717 17:49:33.836952   39794 cri.go:89] found id: "5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a"
	I0717 17:49:33.836956   39794 cri.go:89] found id: "f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428"
	I0717 17:49:33.836959   39794 cri.go:89] found id: "0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45"
	I0717 17:49:33.836963   39794 cri.go:89] found id: "2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d"
	I0717 17:49:33.836967   39794 cri.go:89] found id: "d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411"
	I0717 17:49:33.836970   39794 cri.go:89] found id: "2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c"
	I0717 17:49:33.836974   39794 cri.go:89] found id: "5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46"
	I0717 17:49:33.836981   39794 cri.go:89] found id: "515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697"
	I0717 17:49:33.836985   39794 cri.go:89] found id: ""
	I0717 17:49:33.837036   39794 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0717 17:49:33.850888   39794 cri.go:116] JSON = null
	W0717 17:49:33.850933   39794 kubeadm.go:399] unpause failed: list paused: list returned 0 containers, but ps returned 10
	I0717 17:49:33.851001   39794 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:49:33.861146   39794 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0717 17:49:33.861164   39794 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0717 17:49:33.861204   39794 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0717 17:49:33.870180   39794 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:49:33.870557   39794 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-333994" does not appear in /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.870654   39794 kubeconfig.go:62] /home/jenkins/minikube-integration/19283-14409/kubeconfig needs updating (will repair): [kubeconfig missing "ha-333994" cluster setting kubeconfig missing "ha-333994" context setting]
	I0717 17:49:33.870894   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.871258   39794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.871471   39794 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.180:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:49:33.871875   39794 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:49:33.872033   39794 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0717 17:49:33.881089   39794 kubeadm.go:630] The running cluster does not require reconfiguration: 192.168.39.180
	I0717 17:49:33.881107   39794 kubeadm.go:597] duration metric: took 19.938705ms to restartPrimaryControlPlane
	I0717 17:49:33.881113   39794 kubeadm.go:394] duration metric: took 81.089134ms to StartCluster
	I0717 17:49:33.881124   39794 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.881175   39794 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.881658   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.881845   39794 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:49:33.881872   39794 start.go:241] waiting for startup goroutines ...
	I0717 17:49:33.881879   39794 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:49:33.882084   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:33.884129   39794 out.go:177] * Enabled addons: 
	I0717 17:49:33.885737   39794 addons.go:510] duration metric: took 3.853682ms for enable addons: enabled=[]
	I0717 17:49:33.885760   39794 start.go:246] waiting for cluster config update ...
	I0717 17:49:33.885767   39794 start.go:255] writing updated cluster config ...
	I0717 17:49:33.887338   39794 out.go:177] 
	I0717 17:49:33.888767   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:33.888845   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:33.890338   39794 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:49:33.891461   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:33.891475   39794 cache.go:56] Caching tarball of preloaded images
	I0717 17:49:33.891543   39794 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:49:33.891554   39794 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:49:33.891626   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:33.891771   39794 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:49:33.891806   39794 start.go:364] duration metric: took 19.128µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:49:33.891819   39794 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:49:33.891826   39794 fix.go:54] fixHost starting: m02
	I0717 17:49:33.892056   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:33.892076   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:33.906264   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44047
	I0717 17:49:33.906599   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:33.907064   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:33.907083   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:33.907400   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:33.907566   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:33.907713   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:49:33.909180   39794 fix.go:112] recreateIfNeeded on ha-333994-m02: state=Stopped err=<nil>
	I0717 17:49:33.909199   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	W0717 17:49:33.909338   39794 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:49:33.911077   39794 out.go:177] * Restarting existing kvm2 VM for "ha-333994-m02" ...
	I0717 17:49:33.912122   39794 main.go:141] libmachine: (ha-333994-m02) Calling .Start
	I0717 17:49:33.912246   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:49:33.912879   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:49:33.913156   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:49:33.913539   39794 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:49:33.914190   39794 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:49:35.092192   39794 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:49:35.092951   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.093269   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.093360   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.093273   39957 retry.go:31] will retry after 192.383731ms: waiting for machine to come up
	I0717 17:49:35.287679   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.288078   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.288104   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.288046   39957 retry.go:31] will retry after 385.654698ms: waiting for machine to come up
	I0717 17:49:35.675666   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.676036   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.676064   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.675991   39957 retry.go:31] will retry after 420.16772ms: waiting for machine to come up
	I0717 17:49:36.097264   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:36.097632   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:36.097689   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:36.097608   39957 retry.go:31] will retry after 593.383084ms: waiting for machine to come up
	I0717 17:49:36.692388   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:36.692779   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:36.692805   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:36.692748   39957 retry.go:31] will retry after 522.894623ms: waiting for machine to come up
	I0717 17:49:37.217539   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:37.217939   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:37.217974   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:37.217901   39957 retry.go:31] will retry after 618.384823ms: waiting for machine to come up
	I0717 17:49:37.837779   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:37.838175   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:37.838200   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:37.838142   39957 retry.go:31] will retry after 1.091652031s: waiting for machine to come up
	I0717 17:49:38.931763   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:38.932219   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:38.932247   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:38.932134   39957 retry.go:31] will retry after 1.341674427s: waiting for machine to come up
	I0717 17:49:40.275320   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:40.275792   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:40.275820   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:40.275754   39957 retry.go:31] will retry after 1.293235927s: waiting for machine to come up
	I0717 17:49:41.571340   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:41.571705   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:41.571732   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:41.571661   39957 retry.go:31] will retry after 1.542371167s: waiting for machine to come up
	I0717 17:49:43.115333   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:43.115796   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:43.115826   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:43.115760   39957 retry.go:31] will retry after 1.886589943s: waiting for machine to come up
	I0717 17:49:45.004358   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:45.004727   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:45.004763   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:45.004693   39957 retry.go:31] will retry after 2.72551249s: waiting for machine to come up
	I0717 17:49:47.733475   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:47.733874   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:47.733902   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:47.733829   39957 retry.go:31] will retry after 3.239443396s: waiting for machine to come up
	I0717 17:49:50.975432   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.975912   39794 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:49:50.975930   39794 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:49:50.975960   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.976436   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:50.976461   39794 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:49:50.976480   39794 main.go:141] libmachine: (ha-333994-m02) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"}
	I0717 17:49:50.976499   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:49:50.976514   39794 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:49:50.978829   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.979226   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:50.979246   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.979387   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:49:50.979411   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:49:50.979431   39794 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:49:50.979444   39794 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:49:50.979455   39794 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:49:51.106070   39794 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:49:51.106413   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:49:51.106973   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:51.109287   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.109592   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.109618   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.109826   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:51.110023   39794 machine.go:94] provisionDockerMachine start ...
	I0717 17:49:51.110040   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.110237   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.112084   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.112321   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.112346   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.112436   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.112578   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.112724   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.112869   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.113027   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.113194   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.113205   39794 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:49:51.214365   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:49:51.214388   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.214600   39794 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:49:51.214629   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.214801   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.217146   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.217465   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.217489   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.217600   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.217758   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.217934   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.218049   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.218223   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.218385   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.218401   39794 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:49:51.334279   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:49:51.334317   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.337581   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.337905   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.337933   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.338139   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.338346   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.338512   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.338693   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.338845   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.339025   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.339046   39794 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:49:51.454925   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:49:51.454956   39794 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:49:51.454978   39794 buildroot.go:174] setting up certificates
	I0717 17:49:51.454987   39794 provision.go:84] configureAuth start
	I0717 17:49:51.454999   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.455257   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:51.457564   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.457851   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.457873   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.458013   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.459810   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.460165   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.460190   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.460306   39794 provision.go:143] copyHostCerts
	I0717 17:49:51.460327   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:51.460352   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:49:51.460360   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:51.460411   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:49:51.460474   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:51.460493   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:49:51.460497   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:51.460514   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:49:51.460556   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:51.460571   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:49:51.460577   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:51.460593   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:49:51.460641   39794 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:49:51.635236   39794 provision.go:177] copyRemoteCerts
	I0717 17:49:51.635286   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:49:51.635308   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.638002   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.638369   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.638395   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.638622   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.638815   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.638982   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.639145   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:51.720405   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:49:51.720478   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:49:51.746352   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:49:51.746412   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:49:51.770628   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:49:51.770702   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:49:51.795258   39794 provision.go:87] duration metric: took 340.256082ms to configureAuth
	I0717 17:49:51.795284   39794 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:49:51.795490   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:51.795501   39794 machine.go:97] duration metric: took 685.467301ms to provisionDockerMachine
	I0717 17:49:51.795514   39794 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:49:51.795528   39794 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:49:51.795563   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.795850   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:49:51.795874   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.798310   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.798696   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.798719   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.798889   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.799047   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.799191   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.799286   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:51.881403   39794 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:49:51.885516   39794 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:49:51.885542   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:49:51.885603   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:49:51.885687   39794 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:49:51.885697   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:49:51.885773   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:49:51.894953   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:51.919442   39794 start.go:296] duration metric: took 123.913575ms for postStartSetup
	I0717 17:49:51.919487   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.919775   39794 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:49:51.919801   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.922159   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.922506   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.922533   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.922672   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.922878   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.923036   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.923152   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:52.004408   39794 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:49:52.004481   39794 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:49:52.063014   39794 fix.go:56] duration metric: took 18.171175537s for fixHost
	I0717 17:49:52.063061   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:52.065858   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.066239   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.066269   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.066459   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.066648   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.066806   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.066931   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.067086   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:52.067288   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:52.067303   39794 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:49:52.166802   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238592.140235525
	
	I0717 17:49:52.166826   39794 fix.go:216] guest clock: 1721238592.140235525
	I0717 17:49:52.166835   39794 fix.go:229] Guest: 2024-07-17 17:49:52.140235525 +0000 UTC Remote: 2024-07-17 17:49:52.063042834 +0000 UTC m=+40.822975139 (delta=77.192691ms)
	I0717 17:49:52.166849   39794 fix.go:200] guest clock delta is within tolerance: 77.192691ms
	I0717 17:49:52.166853   39794 start.go:83] releasing machines lock for "ha-333994-m02", held for 18.275039229s
	I0717 17:49:52.166873   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.167105   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:52.169592   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.169924   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.169948   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.172181   39794 out.go:177] * Found network options:
	I0717 17:49:52.173607   39794 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:49:52.174972   39794 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:49:52.175003   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175597   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175781   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175858   39794 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:49:52.175897   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:49:52.175951   39794 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:49:52.176007   39794 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:49:52.176024   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:52.178643   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.178748   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179072   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.179098   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179230   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.179248   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.179272   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179432   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.179524   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.179596   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.179664   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.179721   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.179794   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:52.179844   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:49:52.256371   39794 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:49:52.256433   39794 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:49:52.287825   39794 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:49:52.287848   39794 start.go:495] detecting cgroup driver to use...
	I0717 17:49:52.287901   39794 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:49:52.316497   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:49:52.330140   39794 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:49:52.330189   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:49:52.343721   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:49:52.357273   39794 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:49:52.483050   39794 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:49:52.682504   39794 docker.go:233] disabling docker service ...
	I0717 17:49:52.682571   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:49:52.702383   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:49:52.717022   39794 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:49:52.851857   39794 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:49:52.989407   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:49:53.003913   39794 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:49:53.024876   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:49:53.035470   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:49:53.046129   39794 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:49:53.046184   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:49:53.056553   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:53.067211   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:49:53.077626   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:53.088680   39794 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:49:53.100371   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:49:53.111920   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:49:53.123072   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:49:53.133713   39794 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:49:53.143333   39794 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:49:53.143405   39794 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:49:53.157890   39794 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:49:53.167934   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:53.302893   39794 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:49:53.333425   39794 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:49:53.333488   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:53.339060   39794 retry.go:31] will retry after 1.096332725s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:49:54.435963   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:54.441531   39794 start.go:563] Will wait 60s for crictl version
	I0717 17:49:54.441599   39794 ssh_runner.go:195] Run: which crictl
	I0717 17:49:54.445786   39794 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:49:54.483822   39794 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:49:54.483877   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:54.518845   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:54.553079   39794 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:49:54.554649   39794 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:49:54.556061   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:54.559046   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:54.559422   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:54.559444   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:54.559695   39794 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:49:54.564470   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:54.579269   39794 mustload.go:65] Loading cluster: ha-333994
	I0717 17:49:54.579483   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:54.579765   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:54.579792   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:54.594439   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39001
	I0717 17:49:54.594883   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:54.595350   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:54.595374   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:54.595675   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:54.595858   39794 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:49:54.597564   39794 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:49:54.597896   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:54.597921   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:54.613634   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34405
	I0717 17:49:54.614031   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:54.614493   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:54.614511   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:54.614816   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:54.615002   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:54.615153   39794 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:49:54.615165   39794 certs.go:194] generating shared ca certs ...
	I0717 17:49:54.615183   39794 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:54.615314   39794 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:49:54.615354   39794 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:49:54.615363   39794 certs.go:256] generating profile certs ...
	I0717 17:49:54.615452   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:49:54.615493   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:49:54.615524   39794 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:49:54.615535   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:49:54.615548   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:49:54.615560   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:49:54.615575   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:49:54.615587   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:49:54.615599   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:49:54.615635   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:49:54.615651   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:49:54.615692   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:49:54.615716   39794 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:49:54.615731   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:49:54.615754   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:49:54.615774   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:49:54.615795   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:49:54.615829   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:54.615854   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:49:54.615866   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:49:54.615877   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:54.615902   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:54.618791   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:54.619169   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:54.619191   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:54.619351   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:54.619524   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:54.619660   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:54.619789   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:54.694549   39794 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0717 17:49:54.699693   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:49:54.711136   39794 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0717 17:49:54.715759   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:49:54.727707   39794 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:49:54.732038   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:49:54.743206   39794 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:49:54.747536   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:49:54.759182   39794 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:49:54.763279   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:49:54.774195   39794 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0717 17:49:54.778345   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:49:54.790000   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:49:54.817482   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:49:54.842528   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:49:54.867521   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:49:54.893528   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0717 17:49:54.920674   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:49:54.946673   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:49:54.972385   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:49:54.997675   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:49:55.023298   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:49:55.048552   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:49:55.073345   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:49:55.091193   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:49:55.108383   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:49:55.125529   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:49:55.142804   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:49:55.160482   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:49:55.178995   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:49:55.197026   39794 ssh_runner.go:195] Run: openssl version
	I0717 17:49:55.202998   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:49:55.214662   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.219373   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.219447   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.225441   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:49:55.236543   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:49:55.247672   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.252336   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.252396   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.258207   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:49:55.269215   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:49:55.280136   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.284763   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.284843   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.290471   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:49:55.301174   39794 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:49:55.305201   39794 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:49:55.305253   39794 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:49:55.305343   39794 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:49:55.305377   39794 kube-vip.go:115] generating kube-vip config ...
	I0717 17:49:55.305412   39794 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:49:55.322820   39794 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:49:55.322885   39794 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:49:55.322938   39794 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:49:55.332945   39794 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:49:55.333009   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0717 17:49:55.342555   39794 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
	I0717 17:49:55.358883   39794 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:49:55.375071   39794 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:49:55.393413   39794 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:49:55.397331   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:55.411805   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:55.535806   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:55.554620   39794 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:49:55.554913   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:55.556751   39794 out.go:177] * Verifying Kubernetes components...
	I0717 17:49:55.558066   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:55.748334   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:56.613699   39794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:56.613920   39794 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)
}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0717 17:49:56.613970   39794 kubeadm.go:483] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.180:8443
	I0717 17:49:56.614170   39794 node_ready.go:35] waiting up to 6m0s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:49:56.614265   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:56.614272   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:56.614280   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:56.614286   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:56.627325   39794 round_trippers.go:574] Response Status: 404 Not Found in 13 milliseconds
	I0717 17:49:57.115057   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:57.115083   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:57.115091   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:57.115095   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:57.117582   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:57.614333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:57.614354   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:57.614362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:57.614365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:57.616581   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:58.115352   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:58.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:58.115362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:58.117636   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.615397   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:58.615423   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:58.615434   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:58.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:58.617780   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.617919   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:49:59.114753   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:59.114774   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:59.114782   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:59.114787   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:59.116989   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:59.615261   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:59.615289   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:59.615299   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:59.615305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:59.617539   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:00.115327   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:00.115348   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:00.115356   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:00.115359   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:00.117595   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:00.615335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:00.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:00.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:00.615371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:00.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:01.115332   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:01.115352   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:01.115360   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:01.115364   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:01.118462   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:01.118555   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:01.614396   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:01.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:01.614425   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:01.614429   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:01.616688   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:02.115381   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:02.115413   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:02.115424   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:02.115429   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:02.117845   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:02.614519   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:02.614541   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:02.614549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:02.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:02.616973   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.114666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:03.114690   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:03.114706   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:03.114711   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:03.116808   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.614478   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:03.614500   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:03.614508   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:03.614512   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:03.616763   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.616861   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:04.115079   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:04.115103   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:04.115110   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:04.115116   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:04.117400   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:04.614899   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:04.614922   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:04.614932   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:04.614936   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:04.617138   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.115001   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:05.115024   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:05.115031   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:05.115039   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:05.117375   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.615121   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:05.615145   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:05.615153   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:05.615158   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:05.617472   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.617581   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:06.115206   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:06.115226   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:06.115235   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:06.115240   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:06.117694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:06.614430   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:06.614453   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:06.614462   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:06.614467   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:06.616849   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:07.115357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:07.115378   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:07.115386   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:07.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:07.117909   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:07.614460   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:07.614484   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:07.614492   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:07.614497   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:07.617064   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:08.115383   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:08.115405   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:08.115412   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:08.115417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:08.117848   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:08.117947   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:08.614392   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:08.614415   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:08.614423   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:08.614427   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:08.616608   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:09.114929   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:09.114950   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:09.114958   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:09.114962   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:09.117409   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:09.614639   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:09.614659   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:09.614666   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:09.614670   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:09.616904   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.114644   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:10.114668   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:10.114676   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:10.114685   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:10.117224   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.614973   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:10.614995   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:10.615003   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:10.615007   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:10.617362   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.617474   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:11.115160   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:11.115187   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:11.115197   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:11.115202   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:11.117916   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:11.615031   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:11.615053   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:11.615061   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:11.615065   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:11.617581   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.115275   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:12.115297   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:12.115305   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:12.115308   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:12.117625   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.615329   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:12.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:12.615364   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:12.615367   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:12.617808   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.617929   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:13.114465   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:13.114488   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:13.114497   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:13.114501   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:13.116973   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:13.614674   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:13.614704   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:13.614715   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:13.614721   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:13.617161   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:14.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:14.115351   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:14.115357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:14.115360   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:14.117798   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:14.615028   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:14.615052   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:14.615062   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:14.615068   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:14.617174   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:15.115117   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:15.115140   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:15.115149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:15.115154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:15.117832   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:15.117958   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:15.614474   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:15.614517   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:15.614528   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:15.614534   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:15.616851   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:16.114493   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:16.114529   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:16.114536   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:16.114540   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:16.117140   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:16.614895   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:16.614922   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:16.614935   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:16.614943   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:16.617847   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.114480   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:17.114500   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:17.114507   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:17.114510   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:17.116841   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.614484   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:17.614505   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:17.614512   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:17.614515   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:17.616877   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.617049   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:18.115326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:18.115346   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:18.115354   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:18.115358   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:18.117690   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:18.614346   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:18.614364   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:18.614372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:18.614377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:18.617203   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:19.114315   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:19.114349   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:19.114357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:19.114362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:19.119328   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:50:19.614516   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:19.614536   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:19.614544   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:19.614549   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:19.616974   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:19.617173   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:20.114872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:20.114896   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:20.114905   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:20.114908   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:20.117228   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:20.614953   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:20.614974   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:20.614981   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:20.614987   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:20.617553   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:21.115256   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:21.115288   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:21.115297   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:21.115305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:21.117516   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:21.614470   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:21.614493   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:21.614504   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:21.614512   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:21.616801   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:22.114458   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:22.114481   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:22.114491   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:22.114497   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:22.116704   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:22.116814   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:22.614361   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:22.614383   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:22.614391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:22.614395   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:22.616868   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:23.115322   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:23.115343   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:23.115351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:23.115356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:23.117765   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:23.614438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:23.614469   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:23.614480   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:23.614486   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:23.616902   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:24.115254   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:24.115277   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:24.115287   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:24.115292   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:24.117319   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:24.117422   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:24.614618   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:24.614655   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:24.614665   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:24.614669   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:24.617182   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:25.115401   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:25.115422   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:25.115430   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:25.115434   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:25.118835   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:25.614325   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:25.614351   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:25.614361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:25.614366   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:25.616764   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.114413   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:26.114451   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:26.114460   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:26.114464   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:26.117000   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.614789   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:26.614815   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:26.614826   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:26.614831   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:26.617192   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.617279   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:27.114863   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:27.114888   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:27.114897   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:27.114903   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:27.117792   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:27.615352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:27.615378   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:27.615389   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:27.615394   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:27.618057   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:28.115330   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:28.115353   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:28.115362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:28.115365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:28.117820   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:28.615355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:28.615377   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:28.615385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:28.615389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:28.619637   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:50:28.619765   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:29.114706   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:29.114727   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:29.114734   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:29.114738   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:29.117064   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:29.614803   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:29.614826   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:29.614835   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:29.614839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:29.617436   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:30.114527   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:30.114550   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:30.114560   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:30.114565   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:30.116856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:30.614518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:30.614542   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:30.614551   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:30.614554   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:30.617139   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:31.114819   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:31.114856   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:31.114867   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:31.114873   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:31.117237   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:31.117345   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:31.615179   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:31.615203   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:31.615219   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:31.615224   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:31.617525   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:32.115306   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:32.115329   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:32.115337   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:32.115341   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:32.117639   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:32.614367   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:32.614391   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:32.614399   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:32.614403   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:32.617172   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:33.115127   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:33.115150   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:33.115158   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:33.115162   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:33.117796   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:33.117911   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:33.614544   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:33.614586   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:33.614597   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:33.614611   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:33.616706   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:34.115175   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:34.115197   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:34.115207   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:34.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:34.117345   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:34.614352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:34.614373   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:34.614380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:34.614384   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:34.616826   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.114840   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:35.114867   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:35.114876   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:35.114881   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:35.117298   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.615114   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:35.615140   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:35.615149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:35.615154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:35.617788   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.617897   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:36.115372   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:36.115393   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:36.115402   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:36.115405   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:36.117735   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:36.615357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:36.615376   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:36.615383   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:36.615388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:36.617891   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:37.114533   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:37.114559   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:37.114567   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:37.114572   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:37.117729   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:37.615384   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:37.615406   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:37.615414   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:37.615417   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:37.617760   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:38.114425   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:38.114448   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:38.114455   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:38.114458   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:38.117016   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:38.117135   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:38.614755   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:38.614779   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:38.614787   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:38.614790   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:38.617099   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:39.115282   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:39.115303   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:39.115311   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:39.115315   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:39.117895   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:39.614832   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:39.614853   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:39.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:39.614865   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:39.617355   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:40.115339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:40.115361   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:40.115369   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:40.115373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:40.117661   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:40.117759   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:40.614368   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:40.614389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:40.614396   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:40.614399   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:40.616881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:41.114581   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:41.114606   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:41.114616   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:41.114622   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:41.116877   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:41.614884   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:41.614906   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:41.614914   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:41.614919   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:41.617263   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:42.115156   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:42.115181   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:42.115193   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:42.115201   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:42.117713   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:42.117819   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:42.614328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:42.614348   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:42.614356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:42.614361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:42.617382   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:43.115106   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:43.115127   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:43.115135   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:43.115140   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:43.117355   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:43.615121   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:43.615142   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:43.615149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:43.615154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:43.617549   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.114805   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:44.114826   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:44.114834   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:44.114839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:44.117204   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.615412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:44.615431   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:44.615439   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:44.615444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:44.617856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.617969   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:45.115069   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:45.115093   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:45.115105   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:45.115110   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:45.117220   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:45.614988   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:45.615008   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:45.615015   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:45.615018   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:45.617421   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:46.115156   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:46.115178   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:46.115199   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:46.115203   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:46.117371   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:46.615053   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:46.615076   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:46.615086   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:46.615094   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:46.617407   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:47.115173   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:47.115195   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:47.115203   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:47.115207   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:47.117871   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:47.117975   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:47.614555   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:47.614577   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:47.614586   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:47.614589   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:47.617103   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:48.114743   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:48.114770   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:48.114782   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:48.114787   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:48.116997   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:48.614683   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:48.614710   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:48.614721   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:48.614734   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:48.617185   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.115307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:49.115332   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:49.115343   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:49.115347   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:49.117646   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.614838   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:49.614858   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:49.614872   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:49.614880   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:49.617342   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.617440   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:50.115333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:50.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:50.115372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:50.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:50.117536   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:50.615245   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:50.615270   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:50.615278   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:50.615282   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:50.617747   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:51.114366   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:51.114389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:51.114396   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:51.114400   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:51.116597   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:51.614367   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:51.614389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:51.614397   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:51.614401   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:51.616747   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:52.114431   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:52.114453   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:52.114461   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:52.114464   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:52.117371   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:52.117470   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:52.615088   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:52.615111   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:52.615118   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:52.615122   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:52.617416   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:53.115173   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:53.115195   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:53.115203   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:53.115208   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:53.117683   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:53.614356   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:53.614376   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:53.614384   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:53.614388   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:53.616703   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:54.114990   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:54.115013   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:54.115020   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:54.115024   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:54.117855   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:54.117941   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:54.615104   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:54.615125   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:54.615135   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:54.615140   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:54.617869   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:55.114983   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:55.115005   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:55.115012   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:55.115016   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:55.117396   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:55.615131   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:55.615152   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:55.615168   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:55.615171   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:55.617453   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.115180   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:56.115201   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:56.115209   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:56.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:56.117326   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.615051   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:56.615074   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:56.615082   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:56.615087   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:56.617369   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.617480   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:57.115080   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:57.115102   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:57.115110   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:57.115114   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:57.117510   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:57.615224   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:57.615246   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:57.615254   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:57.615258   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:57.617511   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:58.114791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:58.114811   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:58.114819   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:58.114823   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:58.117265   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:58.614995   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:58.615015   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:58.615023   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:58.615028   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:58.617145   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:59.115321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:59.115342   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:59.115350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:59.115353   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:59.117772   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:59.117893   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:59.614872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:59.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:59.614903   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:59.614906   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:59.617194   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:00.115270   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:00.115293   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:00.115301   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:00.115304   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:00.117653   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:00.615357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:00.615379   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:00.615387   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:00.615391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:00.617720   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.114385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:01.114407   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:01.114413   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:01.114416   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:01.116717   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.614708   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:01.614735   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:01.614745   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:01.614751   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:01.617211   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.617309   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:02.114916   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:02.114948   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:02.114956   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:02.114965   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:02.117244   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:02.614964   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:02.614987   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:02.614995   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:02.614999   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:02.617512   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:03.115219   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:03.115239   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:03.115247   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:03.115251   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:03.117907   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:03.614525   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:03.614547   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:03.614557   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:03.614561   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:03.621322   39794 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0717 17:51:03.621424   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:04.114491   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:04.114513   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:04.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:04.114527   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:04.116543   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:04.614666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:04.614688   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:04.614699   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:04.614705   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:04.616831   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:05.114969   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:05.114996   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:05.115003   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:05.115008   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:05.117465   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:05.615208   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:05.615231   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:05.615240   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:05.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:05.617689   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:06.114340   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:06.114360   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:06.114368   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:06.114372   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:06.116445   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:06.116590   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:06.615129   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:06.615154   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:06.615165   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:06.615172   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:06.617588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:07.115324   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:07.115349   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:07.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:07.115363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:07.117841   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:07.614530   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:07.614557   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:07.614566   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:07.614570   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:07.617073   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:08.114714   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:08.114739   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:08.114750   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:08.114756   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:08.117056   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:08.117161   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:08.615333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:08.615352   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:08.615360   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:08.615363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:08.617848   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:09.114938   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:09.114965   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:09.114974   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:09.114980   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:09.118060   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:09.615157   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:09.615177   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:09.615186   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:09.615192   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:09.617894   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:10.115084   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:10.115104   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:10.115112   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:10.115120   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:10.117391   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:10.117508   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:10.615120   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:10.615145   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:10.615155   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:10.615161   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:10.617842   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:11.114485   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:11.114507   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:11.114515   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:11.114520   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:11.117245   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:11.615400   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:11.615426   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:11.615437   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:11.615444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:11.617790   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:12.115351   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:12.115374   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:12.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:12.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:12.117803   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:12.117915   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:12.614461   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:12.614485   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:12.614495   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:12.614500   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:12.617208   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:13.114980   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:13.115005   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:13.115016   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:13.115020   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:13.117385   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:13.615122   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:13.615148   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:13.615160   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:13.615166   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:13.617805   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.115212   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:14.115244   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:14.115253   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:14.115258   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:14.117528   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.614681   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:14.614701   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:14.614711   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:14.614717   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:14.617113   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.617211   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:15.115267   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:15.115291   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:15.115302   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:15.115309   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:15.117537   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:15.615307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:15.615331   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:15.615340   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:15.615345   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:15.617660   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.115400   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:16.115426   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:16.115437   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:16.115444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:16.118040   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.614666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:16.614688   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:16.614698   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:16.614703   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:16.617162   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.617258   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:17.114829   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:17.114853   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:17.114863   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:17.114868   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:17.117547   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:17.615274   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:17.615295   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:17.615316   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:17.615323   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:17.617344   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.115064   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:18.115086   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:18.115097   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:18.115101   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:18.117232   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.614999   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:18.615021   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:18.615032   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:18.615037   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:18.617285   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.617392   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:19.114376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:19.114407   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:19.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:19.114451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:19.117257   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:19.615315   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:19.615335   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:19.615344   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:19.615350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:19.617155   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:20.115264   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:20.115284   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:20.115292   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:20.115296   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:20.117412   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:20.615133   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:20.615154   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:20.615162   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:20.615165   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:20.616967   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:21.114603   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:21.114639   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:21.114648   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:21.114655   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:21.116866   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:21.116957   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:21.614816   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:21.614841   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:21.614850   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:21.614854   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:21.617362   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:22.115139   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:22.115162   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:22.115170   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:22.115174   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:22.117729   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:22.614412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:22.614434   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:22.614440   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:22.614444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:22.617178   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.114352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:23.114377   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:23.114388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:23.114392   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:23.116563   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.615345   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:23.615372   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:23.615380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:23.615383   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:23.618002   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.618112   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:24.115378   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:24.115401   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:24.115411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:24.115418   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:24.117758   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:24.614891   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:24.614912   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:24.614922   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:24.614926   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:24.617332   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:25.115412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:25.115436   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:25.115445   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:25.115448   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:25.117910   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:25.614339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:25.614363   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:25.614371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:25.614375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:25.617451   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:26.115183   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:26.115207   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:26.115219   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:26.115225   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:26.117163   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:26.117274   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:26.614942   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:26.614966   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:26.614977   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:26.614984   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:26.617676   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:27.115347   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:27.115370   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:27.115380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:27.115385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:27.117861   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:27.615326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:27.615350   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:27.615359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:27.615363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:27.618250   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:28.114551   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:28.114569   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:28.114577   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:28.114583   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:28.117333   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:28.117440   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:28.615148   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:28.615180   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:28.615191   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:28.615196   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:28.617554   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:29.114764   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:29.114789   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:29.114800   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:29.114804   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:29.116808   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:29.615144   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:29.615168   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:29.615180   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:29.615195   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:29.617588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:30.114646   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:30.114670   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:30.114678   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:30.114681   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:30.116515   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:30.615245   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:30.615265   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:30.615273   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:30.615276   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:30.617998   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:30.618150   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:31.115373   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:31.115395   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:31.115403   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:31.115407   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:31.117657   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:31.614754   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:31.614781   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:31.614789   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:31.614793   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:31.616938   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:32.115334   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:32.115357   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:32.115370   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:32.115374   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:32.117890   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:32.614529   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:32.614551   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:32.614559   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:32.614563   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:32.617063   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:33.114739   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:33.114762   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:33.114769   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:33.114773   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:33.116876   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:33.116968   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:33.614545   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:33.614566   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:33.614574   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:33.614579   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:33.616992   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:34.115382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:34.115403   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:34.115411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:34.115414   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:34.117715   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:34.614863   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:34.614881   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:34.614888   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:34.614893   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:34.617243   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:35.115352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:35.115375   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:35.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:35.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:35.117853   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:35.117957   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:35.614511   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:35.614533   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:35.614541   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:35.614547   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:35.617000   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:36.114661   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:36.114682   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:36.114690   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:36.114695   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:36.117055   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:36.614872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:36.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:36.614903   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:36.614908   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:36.617081   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.114747   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:37.114772   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:37.114783   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:37.114788   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:37.117323   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.615053   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:37.615075   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:37.615086   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:37.615094   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:37.617571   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.617677   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:38.115271   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:38.115293   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:38.115301   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:38.115305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:38.117337   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:38.615114   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:38.615136   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:38.615143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:38.615146   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:38.617524   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:39.114693   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:39.114717   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:39.114726   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:39.114731   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:39.116906   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:39.615059   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:39.615078   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:39.615086   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:39.615090   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:39.617554   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:40.114620   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:40.114645   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:40.114655   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:40.114659   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:40.116637   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:40.116742   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:40.615346   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:40.615368   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:40.615379   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:40.615385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:40.617774   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:41.114442   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:41.114463   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:41.114470   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:41.114474   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:41.116924   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:41.614862   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:41.614882   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:41.614890   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:41.614893   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:41.617121   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:42.114844   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:42.114871   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:42.114880   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:42.114887   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:42.117456   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:42.117549   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:42.615184   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:42.615219   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:42.615228   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:42.615231   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:42.617697   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:43.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:43.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:43.115374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:43.115377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:43.117888   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:43.614542   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:43.614564   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:43.614572   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:43.614575   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:43.617156   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.114390   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:44.114418   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:44.114430   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:44.114435   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:44.116806   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.614781   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:44.614799   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:44.614808   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:44.614813   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:44.616969   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.617103   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:45.115008   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:45.115031   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:45.115040   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:45.115044   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:45.117431   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:45.615224   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:45.615252   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:45.615262   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:45.615266   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:45.617533   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.115209   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:46.115230   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:46.115238   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:46.115243   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:46.118193   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.614898   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:46.614921   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:46.614928   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:46.614932   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:46.617234   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.617429   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:47.115009   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:47.115032   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:47.115040   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:47.115044   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:47.117484   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:47.615213   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:47.615236   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:47.615245   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:47.615249   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:47.617602   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:48.115343   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:48.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:48.115371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:48.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:48.117939   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:48.614599   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:48.614625   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:48.614634   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:48.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:48.617112   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:49.115322   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:49.115343   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:49.115351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:49.115356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:49.117738   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:49.117854   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:49.614434   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:49.614465   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:49.614475   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:49.614479   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:49.617641   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:50.115335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:50.115358   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:50.115366   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:50.115370   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:50.117407   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:50.615319   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:50.615340   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:50.615348   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:50.615353   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:50.617718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.114376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:51.114398   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:51.114407   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:51.114414   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:51.116810   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.614799   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:51.614831   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:51.614839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:51.614844   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:51.617260   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.617398   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:52.115069   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:52.115094   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:52.115102   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:52.115108   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:52.117538   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:52.615321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:52.615352   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:52.615361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:52.615365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:52.617834   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:53.114486   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:53.114512   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:53.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:53.114527   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:53.118242   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:53.615003   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:53.615034   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:53.615045   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:53.615051   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:53.617718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:53.617826   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:54.115063   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:54.115091   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:54.115100   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:54.115105   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:54.117425   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:54.615271   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:54.615295   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:54.615304   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:54.615309   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:54.617987   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:55.115096   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:55.115119   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:55.115127   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:55.115131   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:55.117113   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:55.614857   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:55.614881   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:55.614897   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:55.614903   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:55.617711   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:56.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:56.115349   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:56.115357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:56.115361   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:56.118008   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:56.118139   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:56.614719   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:56.614745   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:56.614752   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:56.614756   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:56.617529   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:57.115288   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:57.115310   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:57.115318   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:57.115321   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:57.117714   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:57.614495   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:57.614517   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:57.614525   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:57.614528   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:57.616925   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.114573   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:58.114598   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:58.114609   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:58.114613   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:58.116783   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.614438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:58.614459   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:58.614469   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:58.614476   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:58.616851   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.616956   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:59.115030   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:59.115055   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:59.115066   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:59.115073   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:59.117347   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:59.615128   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:59.615151   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:59.615159   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:59.615164   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:59.617627   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.114672   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:00.114694   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:00.114702   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:00.114706   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:00.117073   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.614975   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:00.614999   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:00.615009   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:00.615014   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:00.617143   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.617251   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:01.114805   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:01.114842   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:01.114852   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:01.114858   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:01.117434   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:01.614407   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:01.614440   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:01.614448   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:01.614453   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:01.617018   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:02.114693   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:02.114715   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:02.114722   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:02.114727   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:02.116963   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:02.614625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:02.614650   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:02.614660   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:02.614664   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:02.617042   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:03.114744   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:03.114767   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:03.114775   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:03.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:03.116932   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:03.117041   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:03.614597   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:03.614618   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:03.614626   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:03.614630   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:03.616748   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:04.115018   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:04.115039   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:04.115049   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:04.115053   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:04.117556   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:04.615321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:04.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:04.615361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:04.615368   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:04.617694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:05.114830   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:05.114857   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:05.114865   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:05.114869   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:05.117278   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:05.117380   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:05.615000   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:05.615035   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:05.615046   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:05.615052   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:05.617339   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:06.115037   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:06.115056   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:06.115062   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:06.115066   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:06.117588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:06.614309   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:06.614333   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:06.614341   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:06.614346   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:06.616516   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:07.115312   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:07.115336   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:07.115345   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:07.115349   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:07.117526   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:07.117714   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:07.615376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:07.615398   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:07.615406   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:07.615410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:07.617826   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:08.114477   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:08.114499   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:08.114507   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:08.114511   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:08.116889   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:08.614611   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:08.614639   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:08.614649   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:08.614658   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:08.616943   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.115169   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:09.115191   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:09.115199   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:09.115202   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:09.117574   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.615328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:09.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:09.615357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:09.615361   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:09.617889   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.618007   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:10.115232   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:10.115254   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:10.115262   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:10.115268   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:10.117721   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:10.614358   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:10.614381   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:10.614388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:10.614391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:10.616539   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:11.115338   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:11.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:11.115377   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:11.115384   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:11.117600   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:11.614501   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:11.614525   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:11.614535   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:11.614539   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:11.616883   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:12.114522   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:12.114544   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:12.114552   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:12.114557   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:12.117075   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:12.117189   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:12.614823   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:12.614850   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:12.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:12.614866   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:12.617132   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:13.114797   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:13.114818   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:13.114830   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:13.114835   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:13.117193   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:13.614859   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:13.614880   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:13.614887   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:13.614891   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:13.617224   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:14.114680   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:14.114701   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:14.114708   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:14.114713   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:14.117640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:14.117759   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:14.615371   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:14.615399   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:14.615408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:14.615412   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:14.617899   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:15.115288   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:15.115307   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:15.115316   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:15.115320   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:15.117625   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:15.615379   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:15.615399   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:15.615407   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:15.615410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:15.617678   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.115335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:16.115358   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:16.115368   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:16.115373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:16.117508   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.615332   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:16.615355   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:16.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:16.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:16.617762   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.617852   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:17.115342   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:17.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:17.115374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:17.115380   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:17.117745   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:17.614381   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:17.614404   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:17.614411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:17.614414   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:17.616676   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:18.114344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:18.114365   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:18.114372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:18.114377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:18.116126   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:18.614823   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:18.614850   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:18.614859   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:18.614863   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:18.617249   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:19.114382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:19.114404   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:19.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:19.114422   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:19.116549   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:19.116667   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:19.615132   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:19.615157   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:19.615166   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:19.615171   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:19.617897   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:20.115394   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:20.115422   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:20.115433   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:20.115438   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:20.120626   39794 round_trippers.go:574] Response Status: 404 Not Found in 5 milliseconds
	I0717 17:52:20.615314   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:20.615335   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:20.615343   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:20.615348   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:20.617815   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:21.114476   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:21.114497   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:21.114509   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:21.114516   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:21.116694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:21.116789   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:21.614568   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:21.614590   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:21.614596   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:21.614600   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:21.616740   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:22.114442   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:22.114465   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:22.114472   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:22.114477   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:22.116620   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:22.615373   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:22.615414   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:22.615422   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:22.615425   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:22.617822   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:23.115355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:23.115377   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:23.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:23.115390   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:23.117793   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:23.117961   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:23.614462   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:23.614484   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:23.614492   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:23.614495   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:23.616758   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:24.115153   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:24.115174   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:24.115183   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:24.115187   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:24.117485   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:24.615251   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:24.615278   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:24.615289   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:24.615294   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:24.618155   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.114625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:25.114648   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:25.114656   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:25.114660   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:25.117162   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.614843   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:25.614863   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:25.614871   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:25.614875   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:25.616943   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.617057   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:26.114625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:26.114665   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:26.114677   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:26.114681   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:26.116743   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:26.614490   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:26.614512   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:26.614521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:26.614524   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:26.616812   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:27.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:27.115366   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:27.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:27.115379   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:27.117751   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:27.614385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:27.614429   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:27.614436   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:27.614440   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:27.616766   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:28.114438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:28.114463   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:28.114472   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:28.114476   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:28.116881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:28.116995   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:28.614550   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:28.614573   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:28.614583   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:28.614589   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:28.616576   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:29.114665   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:29.114688   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:29.114697   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:29.114701   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:29.116949   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:29.614618   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:29.614639   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:29.614647   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:29.614652   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:29.617229   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:30.114692   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:30.114711   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:30.114718   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:30.114725   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:30.116453   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:30.615200   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:30.615233   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:30.615241   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:30.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:30.617947   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:30.618078   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:31.114620   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:31.114663   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:31.114674   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:31.114677   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:31.116821   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:31.614807   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:31.614849   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:31.614857   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:31.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:31.617107   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:32.114733   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:32.114772   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:32.114780   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:32.114784   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:32.117117   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:32.614873   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:32.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:32.614906   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:32.614913   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:32.617084   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:33.114744   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:33.114767   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:33.114774   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:33.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:33.116968   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:33.117056   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:33.614614   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:33.614634   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:33.614642   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:33.614648   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:33.616694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:34.114989   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:34.115010   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:34.115019   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:34.115023   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:34.117256   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:34.615017   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:34.615039   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:34.615046   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:34.615049   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:34.617305   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.114707   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:35.114729   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:35.114737   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:35.114741   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:35.116837   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.614518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:35.614541   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:35.614549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:35.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:35.617169   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.617264   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:36.114880   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:36.114903   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:36.114912   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:36.114915   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:36.117413   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:36.615154   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:36.615178   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:36.615186   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:36.615189   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:36.617681   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:37.114404   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:37.114427   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:37.114435   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:37.114438   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:37.116709   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:37.614419   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:37.614444   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:37.614452   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:37.614465   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:37.616814   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:38.114522   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:38.114550   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:38.114560   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:38.114566   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:38.117012   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:38.117111   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:38.614715   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:38.614738   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:38.614746   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:38.614750   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:38.617029   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:39.115300   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:39.115321   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:39.115330   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:39.115334   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:39.117647   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:39.615387   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:39.615412   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:39.615418   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:39.615422   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:39.617840   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.114520   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:40.114541   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:40.114548   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:40.114553   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:40.116874   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.614642   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:40.614667   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:40.614677   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:40.614682   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:40.617201   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.617299   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:41.114884   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:41.114913   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:41.114925   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:41.114930   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:41.117705   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:41.614760   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:41.614784   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:41.614793   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:41.614799   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:41.617304   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.115055   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:42.115077   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:42.115086   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:42.115092   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:42.117464   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.615207   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:42.615231   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:42.615242   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:42.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:42.617788   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.617906   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:43.114443   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:43.114471   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:43.114484   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:43.114489   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:43.116804   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:43.614503   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:43.614534   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:43.614546   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:43.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:43.616923   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:44.114333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:44.114362   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:44.114371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:44.114376   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:44.116593   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:44.615353   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:44.615375   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:44.615383   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:44.615387   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:44.619020   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:44.619252   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:45.114535   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:45.114558   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:45.114565   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:45.114568   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:45.116805   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:45.614455   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:45.614477   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:45.614485   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:45.614489   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:45.616531   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:46.115306   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:46.115327   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:46.115334   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:46.115340   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:46.117430   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:46.615326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:46.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:46.615358   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:46.615364   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:46.617638   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:47.115375   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:47.115397   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:47.115405   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:47.115410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:47.117966   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:47.118069   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:47.614605   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:47.614627   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:47.614635   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:47.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:47.617373   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:48.115142   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:48.115164   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:48.115173   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:48.115177   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:48.117353   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:48.615075   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:48.615097   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:48.615105   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:48.615109   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:48.617317   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.114470   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:49.114492   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:49.114501   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:49.114506   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:49.116813   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.615412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:49.615433   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:49.615441   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:49.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:49.617717   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.617816   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:50.115355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:50.115376   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:50.115384   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:50.115389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:50.117802   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:50.614440   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:50.614462   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:50.614469   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:50.614474   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:50.616542   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:51.115295   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:51.115318   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:51.115325   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:51.115329   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:51.118739   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:51.614657   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:51.614694   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:51.614703   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:51.614708   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:51.616892   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:52.114541   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:52.114568   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:52.114575   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:52.114578   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:52.117054   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:52.117156   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:52.614718   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:52.614748   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:52.614759   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:52.614765   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:52.617263   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:53.114959   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:53.114984   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:53.114996   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:53.115000   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:53.117274   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:53.615035   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:53.615060   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:53.615070   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:53.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:53.617250   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.114646   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:54.114679   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:54.114686   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:54.114694   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:54.116952   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.614585   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:54.614604   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:54.614612   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:54.614615   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:54.616959   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.617087   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:55.114518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:55.114543   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:55.114550   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:55.114556   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:55.117176   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:55.614804   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:55.614830   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:55.614843   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:55.614848   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:55.617029   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.114710   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:56.114739   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:56.114750   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:56.114757   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:56.117352   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.615042   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:56.615064   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:56.615072   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:56.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:56.617503   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.617629   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:57.115247   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:57.115273   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:57.115283   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:57.115289   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:57.119157   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:57.614778   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:57.614799   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:57.614808   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:57.614812   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:57.617771   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:58.114423   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:58.114444   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:58.114451   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:58.114455   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:58.116940   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:58.614594   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:58.614616   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:58.614626   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:58.614631   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:58.616901   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:59.114914   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:59.114934   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:59.114942   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:59.114945   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:59.117144   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:59.117235   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:59.614791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:59.614814   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:59.614822   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:59.614827   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:59.617115   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:00.115321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:00.115354   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:00.115362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:00.115366   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:00.117649   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:00.615378   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:00.615400   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:00.615411   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:00.615416   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:00.617719   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:01.114375   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:01.114397   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:01.114404   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:01.114408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:01.116718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:01.614966   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:01.614991   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:01.615002   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:01.615011   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:01.618973   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:01.619078   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:02.114685   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:02.114710   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:02.114718   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:02.114723   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:02.117526   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:02.615258   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:02.615281   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:02.615289   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:02.615293   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:02.617822   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:03.115326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:03.115355   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:03.115366   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:03.115371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:03.117667   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:03.615340   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:03.615365   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:03.615374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:03.615379   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:03.617818   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:04.115204   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:04.115226   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:04.115234   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:04.115238   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:04.117764   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:04.117866   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:04.615339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:04.615357   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:04.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:04.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:04.617952   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:05.114451   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:05.114472   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:05.114480   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:05.114484   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:05.116809   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:05.614454   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:05.614475   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:05.614482   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:05.614487   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:05.616856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.114518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:06.114541   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:06.114549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:06.114553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:06.117433   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.615116   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:06.615137   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:06.615145   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:06.615149   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:06.617328   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.617423   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:07.115073   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:07.115096   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:07.115105   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:07.115109   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:07.117243   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:07.614957   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:07.614980   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:07.614988   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:07.614992   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:07.617455   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:08.115203   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:08.115228   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:08.115237   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:08.115242   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:08.117953   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:08.614601   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:08.614621   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:08.614627   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:08.614632   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:08.616977   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:09.115171   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:09.115192   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:09.115200   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:09.115204   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:09.117505   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:09.117620   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:09.615237   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:09.615259   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:09.615266   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:09.615270   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:09.617567   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:10.115157   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:10.115180   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:10.115188   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:10.115191   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:10.117490   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:10.615247   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:10.615268   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:10.615277   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:10.615280   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:10.618489   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:11.115353   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:11.115374   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:11.115382   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:11.115385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:11.118557   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:11.118654   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:11.614419   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:11.614439   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:11.614447   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:11.614451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:11.616736   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:12.114441   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:12.114467   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:12.114475   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:12.114479   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:12.117113   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:12.615359   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:12.615379   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:12.615387   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:12.615390   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:12.617471   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.115196   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:13.115221   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:13.115230   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:13.115235   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:13.117548   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.615239   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:13.615269   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:13.615279   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:13.615285   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:13.617765   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.617868   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:14.115201   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:14.115222   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:14.115230   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:14.115238   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:14.118205   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:14.614910   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:14.614930   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:14.614941   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:14.614946   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:14.617345   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.114915   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:15.114940   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:15.114953   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:15.114959   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:15.117285   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.615063   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:15.615091   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:15.615102   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:15.615109   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:15.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.617892   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:16.114326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:16.114345   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:16.114353   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:16.114358   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:16.116687   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:16.614425   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:16.614445   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:16.614456   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:16.614463   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:16.616497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:17.115235   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:17.115266   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:17.115275   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:17.115281   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:17.117592   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:17.615370   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:17.615394   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:17.615403   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:17.615408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:17.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:18.115421   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:18.115449   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:18.115460   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:18.115466   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:18.117540   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:18.117666   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:18.615244   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:18.615268   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:18.615280   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:18.615285   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:18.617069   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:53:19.115249   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:19.115272   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:19.115282   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:19.115288   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:19.117713   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:19.614391   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:19.614427   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:19.614435   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:19.614439   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:19.616687   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:20.115222   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:20.115243   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:20.115251   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:20.115255   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:20.119958   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:53:20.120050   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:20.614619   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:20.614641   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:20.614651   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:20.614658   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:20.617751   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:21.115329   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:21.115350   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:21.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:21.115362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:21.118322   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:21.615343   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:21.615364   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:21.615373   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:21.615376   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:21.617662   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.114307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:22.114356   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:22.114367   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:22.114373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:22.116718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.614407   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:22.614436   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:22.614447   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:22.614452   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:22.616582   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.616699   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:23.115301   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:23.115323   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:23.115331   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:23.115335   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:23.117744   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:23.614413   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:23.614437   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:23.614447   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:23.614453   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:23.616559   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.115103   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:24.115133   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:24.115143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:24.115147   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:24.117693   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.614545   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:24.614569   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:24.614577   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:24.614581   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:24.617065   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.617179   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:25.114461   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:25.114485   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:25.114493   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:25.114496   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:25.116786   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:25.614416   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:25.614438   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:25.614446   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:25.614451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:25.616751   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.114388   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:26.114410   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:26.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:26.114421   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:26.116745   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.614603   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:26.614626   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:26.614634   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:26.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:26.617102   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.617207   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:27.114783   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:27.114807   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:27.114818   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:27.114826   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:27.117702   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:27.614374   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:27.614412   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:27.614420   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:27.614425   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:27.616497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.115222   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:28.115243   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:28.115250   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:28.115254   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:28.117523   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.615319   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:28.615342   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:28.615350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:28.615354   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:28.617775   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.617869   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:29.114872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:29.114893   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:29.114901   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:29.114907   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:29.116856   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:53:29.615278   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:29.615300   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:29.615308   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:29.615313   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:29.617690   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:30.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:30.115351   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:30.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:30.115363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:30.117881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:30.614571   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:30.614593   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:30.614601   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:30.614605   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:30.617497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:31.115219   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:31.115240   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:31.115247   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:31.115252   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:31.117580   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:31.117691   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:31.614491   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:31.614514   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:31.614520   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:31.614525   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:31.616752   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:32.114434   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:32.114457   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:32.114465   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:32.114469   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:32.116843   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:32.614510   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:32.614531   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:32.614537   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:32.614540   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:32.617151   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:33.114829   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:33.114852   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:33.114859   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:33.114863   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:33.117627   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:33.117746   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:33.615336   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:33.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:33.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:33.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:33.617473   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:34.114732   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:34.114770   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:34.114783   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:34.114788   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:34.117561   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:34.615316   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:34.615341   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:34.615351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:34.615356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:34.618153   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.114569   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:35.114593   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:35.114601   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:35.114605   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:35.116953   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.614348   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:35.614373   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:35.614383   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:35.614389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:35.617139   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.617237   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:36.114791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:36.114812   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:36.114819   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:36.114823   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:36.117593   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:36.615382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:36.615407   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:36.615417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:36.615426   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:36.618040   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:37.114722   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:37.114753   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:37.114761   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:37.114765   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:37.116947   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:37.614643   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:37.614686   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:37.614697   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:37.614702   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:37.616876   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:38.114536   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:38.114559   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:38.114566   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:38.114570   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:38.117369   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:38.117462   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:38.615126   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:38.615148   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:38.615156   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:38.615160   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:38.617869   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:39.115081   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:39.115113   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:39.115122   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:39.115126   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:39.117948   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:39.614619   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:39.614647   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:39.614659   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:39.614665   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:39.617484   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.115106   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:40.115131   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:40.115143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:40.115149   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:40.117287   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.615033   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:40.615059   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:40.615071   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:40.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:40.617572   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.617676   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:41.115286   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:41.115309   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:41.115316   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:41.115321   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:41.117762   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:41.614713   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:41.614734   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:41.614743   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:41.614747   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:41.617493   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:42.115269   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:42.115292   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:42.115303   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:42.115308   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:42.117720   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:42.614392   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:42.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:42.614427   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:42.614434   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:42.616931   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:43.115385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:43.115412   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:43.115425   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:43.115433   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:43.118066   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:43.118207   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:43.614713   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:43.614753   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:43.614765   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:43.614770   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:43.617067   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:44.114374   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:44.114406   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:44.114415   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:44.114419   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:44.116619   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:44.615405   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:44.615433   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:44.615441   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:44.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:44.617626   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.115126   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:45.115150   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:45.115158   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:45.115163   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:45.117350   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.615112   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:45.615135   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:45.615142   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:45.615147   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:45.617618   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.617714   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:46.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:46.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:46.115371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:46.115374   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:46.117523   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:46.615363   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:46.615386   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:46.615394   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:46.615398   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:46.617675   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.114336   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:47.114357   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:47.114365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:47.114369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:47.116450   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.615209   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:47.615232   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:47.615242   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:47.615248   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:47.617669   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.617889   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:48.114456   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:48.114479   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:48.114488   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:48.114491   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:48.116715   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:48.614390   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:48.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:48.614424   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:48.614427   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:48.616735   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:49.114828   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:49.114850   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:49.114858   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:49.114863   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:49.117111   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:49.614976   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:49.614997   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:49.615005   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:49.615010   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:49.617505   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:50.115004   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:50.115026   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:50.115033   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:50.115038   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:50.117347   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:50.117441   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:50.615143   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:50.615170   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:50.615179   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:50.615187   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:50.617427   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:51.115170   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:51.115193   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:51.115205   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:51.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:51.117289   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:51.615380   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:51.615407   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:51.615419   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:51.615426   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:51.618038   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.114724   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:52.114760   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:52.114773   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:52.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:52.117189   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.614887   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:52.614911   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:52.614922   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:52.614927   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:52.617222   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.617335   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:53.114967   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:53.114994   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:53.115005   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:53.115013   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:53.117578   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:53.614368   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:53.614394   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:53.614404   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:53.614412   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:53.617467   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:54.114883   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:54.114906   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:54.114915   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:54.114921   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:54.117603   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:54.615330   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:54.615353   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:54.615364   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:54.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:54.618101   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:54.618221   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:55.114614   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:55.114640   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:55.114649   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:55.114656   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:55.117436   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:55.615236   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:55.615260   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:55.615270   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:55.615276   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:55.617974   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:56.114490   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:56.114511   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:56.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:56.114524   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:56.117090   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:56.614907   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:56.614932   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:56.614943   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:56.614948   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:56.618676   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:56.618791   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:56.618808   39794 node_ready.go:38] duration metric: took 4m0.004607374s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:53:56.620932   39794 out.go:177] 
	W0717 17:53:56.622268   39794 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0717 17:53:56.622282   39794 out.go:239] * 
	W0717 17:53:56.623241   39794 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:53:56.625101   39794 out.go:177] 

                                                
                                                
** /stderr **
ha_test.go:469: failed to run minikube start. args "out/minikube-linux-amd64 node list -p ha-333994 -v=7 --alsologtostderr" : exit status 80
ha_test.go:472: (dbg) Run:  out/minikube-linux-amd64 node list -p ha-333994
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/RestartClusterKeepsNodes FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/RestartClusterKeepsNodes]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.768952036s)
helpers_test.go:252: TestMultiControlPlane/serial/RestartClusterKeepsNodes logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:37 UTC | 17 Jul 24 17:37 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node stop m02 -v=7         | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node start m02 -v=7        | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-333994 -v=7               | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:46 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-333994 -v=7                    | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:46 UTC | 17 Jul 24 17:49 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-333994 --wait=true -v=7        | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:49 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-333994                    | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:53 UTC |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:49:11
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:49:11.274843   39794 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:49:11.274995   39794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:49:11.275005   39794 out.go:304] Setting ErrFile to fd 2...
	I0717 17:49:11.275011   39794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:49:11.275192   39794 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:49:11.275748   39794 out.go:298] Setting JSON to false
	I0717 17:49:11.276624   39794 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":5494,"bootTime":1721233057,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:49:11.276685   39794 start.go:139] virtualization: kvm guest
	I0717 17:49:11.279428   39794 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:49:11.280920   39794 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:49:11.280939   39794 notify.go:220] Checking for updates...
	I0717 17:49:11.284081   39794 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:49:11.285572   39794 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:11.286973   39794 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:49:11.288259   39794 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:49:11.289617   39794 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:49:11.291360   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:11.291471   39794 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:49:11.291860   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:11.291910   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:11.306389   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41441
	I0717 17:49:11.306830   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:11.307340   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:11.307365   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:11.307652   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:11.307877   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:11.342518   39794 out.go:177] * Using the kvm2 driver based on existing profile
	I0717 17:49:11.343905   39794 start.go:297] selected driver: kvm2
	I0717 17:49:11.343922   39794 start.go:901] validating driver "kvm2" against &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:11.344074   39794 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:49:11.344385   39794 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:49:11.344460   39794 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:49:11.359473   39794 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:49:11.360126   39794 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:49:11.360191   39794 cni.go:84] Creating CNI manager for ""
	I0717 17:49:11.360203   39794 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0717 17:49:11.360258   39794 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:11.360356   39794 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:49:11.362215   39794 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:49:11.363497   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:11.363528   39794 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:49:11.363538   39794 cache.go:56] Caching tarball of preloaded images
	I0717 17:49:11.363621   39794 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:49:11.363633   39794 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:49:11.363751   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:11.363927   39794 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:49:11.363968   39794 start.go:364] duration metric: took 23.038µs to acquireMachinesLock for "ha-333994"
	I0717 17:49:11.363985   39794 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:49:11.363995   39794 fix.go:54] fixHost starting: 
	I0717 17:49:11.364238   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:11.364269   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:11.378515   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45003
	I0717 17:49:11.378994   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:11.379458   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:11.379478   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:11.379772   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:11.379977   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:11.380153   39794 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:49:11.381889   39794 fix.go:112] recreateIfNeeded on ha-333994: state=Stopped err=<nil>
	I0717 17:49:11.381920   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	W0717 17:49:11.382061   39794 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:49:11.384353   39794 out.go:177] * Restarting existing kvm2 VM for "ha-333994" ...
	I0717 17:49:11.386332   39794 main.go:141] libmachine: (ha-333994) Calling .Start
	I0717 17:49:11.386525   39794 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:49:11.387295   39794 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:49:11.387605   39794 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:49:11.387902   39794 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:49:11.388700   39794 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:49:12.581316   39794 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:49:12.582199   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:12.582613   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:12.582685   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:12.582591   39823 retry.go:31] will retry after 292.960023ms: waiting for machine to come up
	I0717 17:49:12.877268   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:12.877833   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:12.877861   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:12.877756   39823 retry.go:31] will retry after 283.500887ms: waiting for machine to come up
	I0717 17:49:13.163417   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:13.163805   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:13.163826   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:13.163761   39823 retry.go:31] will retry after 385.368306ms: waiting for machine to come up
	I0717 17:49:13.550406   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:13.550840   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:13.550897   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:13.550822   39823 retry.go:31] will retry after 528.571293ms: waiting for machine to come up
	I0717 17:49:14.080602   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:14.081093   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:14.081118   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:14.081048   39823 retry.go:31] will retry after 736.772802ms: waiting for machine to come up
	I0717 17:49:14.818924   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:14.819326   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:14.819347   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:14.819281   39823 retry.go:31] will retry after 776.986347ms: waiting for machine to come up
	I0717 17:49:15.598237   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:15.598607   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:15.598627   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:15.598573   39823 retry.go:31] will retry after 1.036578969s: waiting for machine to come up
	I0717 17:49:16.637046   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:16.637440   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:16.637463   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:16.637404   39823 retry.go:31] will retry after 1.055320187s: waiting for machine to come up
	I0717 17:49:17.694838   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:17.695248   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:17.695273   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:17.695211   39823 retry.go:31] will retry after 1.335817707s: waiting for machine to come up
	I0717 17:49:19.032835   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:19.033306   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:19.033330   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:19.033266   39823 retry.go:31] will retry after 1.730964136s: waiting for machine to come up
	I0717 17:49:20.766254   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:20.766740   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:20.766768   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:20.766694   39823 retry.go:31] will retry after 2.796619276s: waiting for machine to come up
	I0717 17:49:23.566195   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:23.566759   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:23.566784   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:23.566716   39823 retry.go:31] will retry after 3.008483388s: waiting for machine to come up
	I0717 17:49:26.576866   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:26.577295   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:26.577318   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:26.577242   39823 retry.go:31] will retry after 2.889284576s: waiting for machine to come up
	I0717 17:49:29.467942   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.468316   39794 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:49:29.468337   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.468346   39794 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:49:29.468737   39794 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:49:29.468757   39794 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:49:29.468777   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.468804   39794 main.go:141] libmachine: (ha-333994) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"}
	I0717 17:49:29.468820   39794 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:49:29.470695   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.471026   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.471058   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.471199   39794 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:49:29.471226   39794 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:49:29.471255   39794 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:49:29.471268   39794 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:49:29.471282   39794 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:49:29.598374   39794 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:49:29.598754   39794 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:49:29.599414   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:29.601913   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.602312   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.602351   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.602634   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:29.602858   39794 machine.go:94] provisionDockerMachine start ...
	I0717 17:49:29.602888   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:29.603106   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.605092   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.605423   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.605446   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.605613   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.605754   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.605900   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.606023   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.606203   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.606385   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.606396   39794 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:49:29.714755   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:49:29.714801   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.715040   39794 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:49:29.715065   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.715237   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.717642   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.717930   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.717959   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.718110   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.718285   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.718413   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.718528   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.718679   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.718838   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.718848   39794 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:49:29.840069   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:49:29.840100   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.842822   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.843208   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.843233   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.843392   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.843581   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.843706   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.843878   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.844054   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.844256   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.844272   39794 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:49:29.959423   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:49:29.959450   39794 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:49:29.959474   39794 buildroot.go:174] setting up certificates
	I0717 17:49:29.959488   39794 provision.go:84] configureAuth start
	I0717 17:49:29.959495   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.959790   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:29.962162   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.962537   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.962563   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.962700   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.964777   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.965084   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.965116   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.965226   39794 provision.go:143] copyHostCerts
	I0717 17:49:29.965266   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:29.965305   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:49:29.965317   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:29.965397   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:49:29.965507   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:29.965534   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:49:29.965544   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:29.965581   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:49:29.965639   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:29.965671   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:49:29.965680   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:29.965714   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:49:29.965774   39794 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:49:30.057325   39794 provision.go:177] copyRemoteCerts
	I0717 17:49:30.057377   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:49:30.057400   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.059825   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.060114   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.060140   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.060281   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.060451   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.060561   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.060675   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.146227   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:49:30.146289   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:49:30.174390   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:49:30.174450   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:49:30.202477   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:49:30.202541   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0717 17:49:30.229907   39794 provision.go:87] duration metric: took 270.408982ms to configureAuth
	I0717 17:49:30.229929   39794 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:49:30.230164   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:30.230177   39794 machine.go:97] duration metric: took 627.307249ms to provisionDockerMachine
	I0717 17:49:30.230186   39794 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:49:30.230200   39794 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:49:30.230227   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.230520   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:49:30.230554   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.233026   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.233363   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.233390   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.233521   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.233700   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.233828   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.233952   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.318669   39794 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:49:30.323112   39794 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:49:30.323131   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:49:30.323180   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:49:30.323246   39794 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:49:30.323258   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:49:30.323348   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:49:30.334564   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:30.360407   39794 start.go:296] duration metric: took 130.206138ms for postStartSetup
	I0717 17:49:30.360441   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.360727   39794 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:49:30.360774   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.362968   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.363308   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.363334   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.363435   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.363609   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.363749   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.363862   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.448825   39794 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:49:30.448901   39794 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:49:30.490930   39794 fix.go:56] duration metric: took 19.126931057s for fixHost
	I0717 17:49:30.490966   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.493716   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.494056   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.494081   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.494261   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.494473   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.494636   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.494816   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.495007   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:30.495221   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:30.495236   39794 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:49:30.611220   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238570.579395854
	
	I0717 17:49:30.611243   39794 fix.go:216] guest clock: 1721238570.579395854
	I0717 17:49:30.611255   39794 fix.go:229] Guest: 2024-07-17 17:49:30.579395854 +0000 UTC Remote: 2024-07-17 17:49:30.49095133 +0000 UTC m=+19.250883626 (delta=88.444524ms)
	I0717 17:49:30.611271   39794 fix.go:200] guest clock delta is within tolerance: 88.444524ms
	I0717 17:49:30.611277   39794 start.go:83] releasing machines lock for "ha-333994", held for 19.24729888s
	I0717 17:49:30.611293   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.611569   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:30.613990   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.614318   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.614355   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.614483   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.614909   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.615067   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.615169   39794 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:49:30.615215   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.615255   39794 ssh_runner.go:195] Run: cat /version.json
	I0717 17:49:30.615275   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.617353   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617676   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.617702   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617734   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617863   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.618049   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.618146   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.618173   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.618217   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.618306   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.618370   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.618445   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.618555   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.618672   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.694919   39794 ssh_runner.go:195] Run: systemctl --version
	I0717 17:49:30.721823   39794 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:49:30.727892   39794 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:49:30.727967   39794 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:49:30.745249   39794 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:49:30.745272   39794 start.go:495] detecting cgroup driver to use...
	I0717 17:49:30.745332   39794 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:49:30.784101   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:49:30.798192   39794 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:49:30.798265   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:49:30.811458   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:49:30.824815   39794 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:49:30.938731   39794 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:49:31.081893   39794 docker.go:233] disabling docker service ...
	I0717 17:49:31.081980   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:49:31.097028   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:49:31.110328   39794 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:49:31.242915   39794 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:49:31.365050   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:49:31.379135   39794 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:49:31.400136   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:49:31.412561   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:49:31.425082   39794 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:49:31.425159   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:49:31.437830   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:31.450453   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:49:31.462175   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:31.473289   39794 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:49:31.484541   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:49:31.495502   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:49:31.506265   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:49:31.518840   39794 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:49:31.530158   39794 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:49:31.530208   39794 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:49:31.548502   39794 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:49:31.563431   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:31.674043   39794 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:49:31.701907   39794 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:49:31.702006   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:31.706668   39794 retry.go:31] will retry after 920.793788ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:49:32.627794   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:32.632953   39794 start.go:563] Will wait 60s for crictl version
	I0717 17:49:32.633009   39794 ssh_runner.go:195] Run: which crictl
	I0717 17:49:32.636846   39794 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:49:32.677947   39794 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:49:32.678013   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:32.709490   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:32.738106   39794 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:49:32.739529   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:32.742040   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:32.742375   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:32.742405   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:32.742590   39794 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:49:32.746706   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
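The one-liner above pins `host.minikube.internal` in `/etc/hosts` idempotently: it strips any existing line for that name, appends the fresh mapping, and copies the result back via a temp file. The same technique as a standalone sketch (function name and file paths are illustrative):

```shell
#!/bin/sh
# Idempotently pin NAME to IP in a hosts-style file: drop any stale
# line ending in NAME, append the new mapping, then replace the file.
pin_host() {
  ip="$1"; name="$2"; hosts="$3"
  tmp=$(mktemp)
  { grep -v "[[:space:]]$name\$" "$hosts" 2>/dev/null
    printf '%s\t%s\n' "$ip" "$name"; } > "$tmp"
  cp "$tmp" "$hosts" && rm -f "$tmp"
}
```

Writing to a temp file first and `cp`-ing over the original means a reader of `/etc/hosts` never sees a half-written file, which matters since the guest resolves through it continuously.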
	I0717 17:49:32.759433   39794 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingre
ss:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:do
cker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:49:32.759609   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:32.759661   39794 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:49:32.792410   39794 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:49:32.792432   39794 containerd.go:534] Images already preloaded, skipping extraction
	I0717 17:49:32.792483   39794 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:49:32.824536   39794 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:49:32.824558   39794 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:49:32.824565   39794 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:49:32.824675   39794 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:49:32.824722   39794 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:49:32.856864   39794 cni.go:84] Creating CNI manager for ""
	I0717 17:49:32.856886   39794 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0717 17:49:32.856893   39794 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:49:32.856917   39794 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:49:32.857032   39794 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:49:32.857054   39794 kube-vip.go:115] generating kube-vip config ...
	I0717 17:49:32.857090   39794 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:49:32.875326   39794 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:49:32.875456   39794 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:49:32.875511   39794 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:49:32.885386   39794 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:49:32.885459   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:49:32.895011   39794 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:49:32.913107   39794 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:49:32.929923   39794 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:49:32.946336   39794 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:49:32.962757   39794 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:49:32.966796   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:32.979550   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:33.092357   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:33.111897   39794 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:49:33.111921   39794 certs.go:194] generating shared ca certs ...
	I0717 17:49:33.111940   39794 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.112113   39794 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:49:33.112206   39794 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:49:33.112225   39794 certs.go:256] generating profile certs ...
	I0717 17:49:33.112347   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:49:33.112383   39794 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1
	I0717 17:49:33.112401   39794 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:49:33.337392   39794 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 ...
	I0717 17:49:33.337432   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1: {Name:mkfeb2a5adc7d732ca48854394be4077f3b9b81e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.337612   39794 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1 ...
	I0717 17:49:33.337630   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1: {Name:mk17811291d2c587100f8fbd5f0c9c2d641ddf76 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.337728   39794 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:49:33.337924   39794 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1 -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:49:33.338098   39794 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:49:33.338134   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:49:33.338154   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:49:33.338172   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:49:33.338188   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:49:33.338203   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:49:33.338221   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:49:33.338239   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:49:33.338253   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:49:33.338313   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:49:33.338354   39794 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:49:33.338363   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:49:33.338391   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:49:33.338431   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:49:33.338457   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:49:33.338511   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:33.338549   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.338570   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.338587   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.339107   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:49:33.371116   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:49:33.405873   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:49:33.442007   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:49:33.472442   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0717 17:49:33.496116   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:49:33.527403   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:49:33.552684   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:49:33.576430   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:49:33.599936   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:49:33.623341   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:49:33.646635   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:49:33.663325   39794 ssh_runner.go:195] Run: openssl version
	I0717 17:49:33.668872   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:49:33.679471   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.683810   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.683866   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.689677   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:49:33.700471   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:49:33.710911   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.715522   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.715581   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.721331   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:49:33.731730   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:49:33.742074   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.746374   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.746417   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.751941   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
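The `openssl x509 -hash` / `ln -fs` pairs above install each CA under its OpenSSL subject-hash name: OpenSSL looks up trust anchors in `/etc/ssl/certs` as `<subject-hash>.0` symlinks. A minimal reproduction, using a throwaway self-signed cert and a temp directory instead of the real trust store:

```shell
#!/bin/sh
# OpenSSL finds CAs via subject-hash symlinks: <hash>.0 -> cert.pem.
# Generate a disposable cert, compute its hash, and link it the same
# way the log does for minikubeCA.pem.
certs=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=hash-demo" \
  -keyout "$certs/key.pem" -out "$certs/demo.pem" 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$certs/demo.pem")
ln -fs "$certs/demo.pem" "$certs/$hash.0"
echo "linked $hash.0"
```

The `.0` suffix is a collision counter; two distinct CAs with the same subject hash would be linked as `<hash>.0` and `<hash>.1`, which is why the log tests `-L` before linking rather than blindly overwriting.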
	I0717 17:49:33.762070   39794 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:49:33.766344   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0717 17:49:33.771976   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0717 17:49:33.777506   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0717 17:49:33.783203   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0717 17:49:33.788713   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0717 17:49:33.794346   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
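The run of `openssl x509 -checkend 86400` calls above is how the restart path decides whether certificates need regeneration: `-checkend N` exits non-zero if the cert expires within the next N seconds (86400 s = 24 h). A minimal reproduction; the self-signed two-day cert here exists only for illustration:

```shell
#!/bin/sh
# Create a throwaway cert valid for 2 days, then ask openssl whether
# it survives the next 24h. Exit status 0 means "still valid then".
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 2 \
  -subj "/CN=checkend-demo" \
  -keyout "$dir/key.pem" -out "$dir/cert.pem" 2>/dev/null
if openssl x509 -noout -in "$dir/cert.pem" -checkend 86400 >/dev/null; then
  echo "cert valid for at least another day"
fi
```

With a 30-day window (`-checkend 2592000`) the same two-day cert would fail the check, which is the signal minikube uses to rotate certs before they lapse mid-cluster.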
	I0717 17:49:33.800031   39794 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Clust
erName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:
false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docke
r BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:33.800131   39794 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:49:33.800172   39794 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:49:33.836926   39794 cri.go:89] found id: "86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21"
	I0717 17:49:33.836947   39794 cri.go:89] found id: "dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f"
	I0717 17:49:33.836952   39794 cri.go:89] found id: "5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a"
	I0717 17:49:33.836956   39794 cri.go:89] found id: "f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428"
	I0717 17:49:33.836959   39794 cri.go:89] found id: "0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45"
	I0717 17:49:33.836963   39794 cri.go:89] found id: "2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d"
	I0717 17:49:33.836967   39794 cri.go:89] found id: "d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411"
	I0717 17:49:33.836970   39794 cri.go:89] found id: "2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c"
	I0717 17:49:33.836974   39794 cri.go:89] found id: "5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46"
	I0717 17:49:33.836981   39794 cri.go:89] found id: "515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697"
	I0717 17:49:33.836985   39794 cri.go:89] found id: ""
	I0717 17:49:33.837036   39794 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0717 17:49:33.850888   39794 cri.go:116] JSON = null
	W0717 17:49:33.850933   39794 kubeadm.go:399] unpause failed: list paused: list returned 0 containers, but ps returned 10
	I0717 17:49:33.851001   39794 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:49:33.861146   39794 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0717 17:49:33.861164   39794 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0717 17:49:33.861204   39794 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0717 17:49:33.870180   39794 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:49:33.870557   39794 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-333994" does not appear in /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.870654   39794 kubeconfig.go:62] /home/jenkins/minikube-integration/19283-14409/kubeconfig needs updating (will repair): [kubeconfig missing "ha-333994" cluster setting kubeconfig missing "ha-333994" context setting]
	I0717 17:49:33.870894   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.871258   39794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.871471   39794 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.180:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)
}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:49:33.871875   39794 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:49:33.872033   39794 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0717 17:49:33.881089   39794 kubeadm.go:630] The running cluster does not require reconfiguration: 192.168.39.180
	I0717 17:49:33.881107   39794 kubeadm.go:597] duration metric: took 19.938705ms to restartPrimaryControlPlane
	I0717 17:49:33.881113   39794 kubeadm.go:394] duration metric: took 81.089134ms to StartCluster
	I0717 17:49:33.881124   39794 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.881175   39794 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.881658   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.881845   39794 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:49:33.881872   39794 start.go:241] waiting for startup goroutines ...
	I0717 17:49:33.881879   39794 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:49:33.882084   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:33.884129   39794 out.go:177] * Enabled addons: 
	I0717 17:49:33.885737   39794 addons.go:510] duration metric: took 3.853682ms for enable addons: enabled=[]
	I0717 17:49:33.885760   39794 start.go:246] waiting for cluster config update ...
	I0717 17:49:33.885767   39794 start.go:255] writing updated cluster config ...
	I0717 17:49:33.887338   39794 out.go:177] 
	I0717 17:49:33.888767   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:33.888845   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:33.890338   39794 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:49:33.891461   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:33.891475   39794 cache.go:56] Caching tarball of preloaded images
	I0717 17:49:33.891543   39794 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:49:33.891554   39794 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:49:33.891626   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:33.891771   39794 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:49:33.891806   39794 start.go:364] duration metric: took 19.128µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:49:33.891819   39794 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:49:33.891826   39794 fix.go:54] fixHost starting: m02
	I0717 17:49:33.892056   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:33.892076   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:33.906264   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44047
	I0717 17:49:33.906599   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:33.907064   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:33.907083   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:33.907400   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:33.907566   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:33.907713   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:49:33.909180   39794 fix.go:112] recreateIfNeeded on ha-333994-m02: state=Stopped err=<nil>
	I0717 17:49:33.909199   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	W0717 17:49:33.909338   39794 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:49:33.911077   39794 out.go:177] * Restarting existing kvm2 VM for "ha-333994-m02" ...
	I0717 17:49:33.912122   39794 main.go:141] libmachine: (ha-333994-m02) Calling .Start
	I0717 17:49:33.912246   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:49:33.912879   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:49:33.913156   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:49:33.913539   39794 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:49:33.914190   39794 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:49:35.092192   39794 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:49:35.092951   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.093269   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.093360   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.093273   39957 retry.go:31] will retry after 192.383731ms: waiting for machine to come up
	I0717 17:49:35.287679   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.288078   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.288104   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.288046   39957 retry.go:31] will retry after 385.654698ms: waiting for machine to come up
	I0717 17:49:35.675666   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.676036   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.676064   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.675991   39957 retry.go:31] will retry after 420.16772ms: waiting for machine to come up
	I0717 17:49:36.097264   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:36.097632   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:36.097689   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:36.097608   39957 retry.go:31] will retry after 593.383084ms: waiting for machine to come up
	I0717 17:49:36.692388   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:36.692779   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:36.692805   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:36.692748   39957 retry.go:31] will retry after 522.894623ms: waiting for machine to come up
	I0717 17:49:37.217539   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:37.217939   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:37.217974   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:37.217901   39957 retry.go:31] will retry after 618.384823ms: waiting for machine to come up
	I0717 17:49:37.837779   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:37.838175   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:37.838200   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:37.838142   39957 retry.go:31] will retry after 1.091652031s: waiting for machine to come up
	I0717 17:49:38.931763   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:38.932219   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:38.932247   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:38.932134   39957 retry.go:31] will retry after 1.341674427s: waiting for machine to come up
	I0717 17:49:40.275320   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:40.275792   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:40.275820   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:40.275754   39957 retry.go:31] will retry after 1.293235927s: waiting for machine to come up
	I0717 17:49:41.571340   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:41.571705   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:41.571732   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:41.571661   39957 retry.go:31] will retry after 1.542371167s: waiting for machine to come up
	I0717 17:49:43.115333   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:43.115796   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:43.115826   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:43.115760   39957 retry.go:31] will retry after 1.886589943s: waiting for machine to come up
	I0717 17:49:45.004358   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:45.004727   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:45.004763   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:45.004693   39957 retry.go:31] will retry after 2.72551249s: waiting for machine to come up
	I0717 17:49:47.733475   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:47.733874   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:47.733902   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:47.733829   39957 retry.go:31] will retry after 3.239443396s: waiting for machine to come up
	I0717 17:49:50.975432   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.975912   39794 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:49:50.975930   39794 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:49:50.975960   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.976436   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:50.976461   39794 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:49:50.976480   39794 main.go:141] libmachine: (ha-333994-m02) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"}
	I0717 17:49:50.976499   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:49:50.976514   39794 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:49:50.978829   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.979226   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:50.979246   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.979387   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:49:50.979411   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:49:50.979431   39794 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:49:50.979444   39794 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:49:50.979455   39794 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:49:51.106070   39794 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:49:51.106413   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:49:51.106973   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:51.109287   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.109592   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.109618   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.109826   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:51.110023   39794 machine.go:94] provisionDockerMachine start ...
	I0717 17:49:51.110040   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.110237   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.112084   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.112321   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.112346   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.112436   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.112578   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.112724   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.112869   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.113027   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.113194   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.113205   39794 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:49:51.214365   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:49:51.214388   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.214600   39794 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:49:51.214629   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.214801   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.217146   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.217465   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.217489   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.217600   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.217758   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.217934   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.218049   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.218223   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.218385   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.218401   39794 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:49:51.334279   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:49:51.334317   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.337581   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.337905   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.337933   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.338139   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.338346   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.338512   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.338693   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.338845   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.339025   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.339046   39794 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:49:51.454925   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:49:51.454956   39794 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:49:51.454978   39794 buildroot.go:174] setting up certificates
	I0717 17:49:51.454987   39794 provision.go:84] configureAuth start
	I0717 17:49:51.454999   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.455257   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:51.457564   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.457851   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.457873   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.458013   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.459810   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.460165   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.460190   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.460306   39794 provision.go:143] copyHostCerts
	I0717 17:49:51.460327   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:51.460352   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:49:51.460360   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:51.460411   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:49:51.460474   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:51.460493   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:49:51.460497   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:51.460514   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:49:51.460556   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:51.460571   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:49:51.460577   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:51.460593   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:49:51.460641   39794 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:49:51.635236   39794 provision.go:177] copyRemoteCerts
	I0717 17:49:51.635286   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:49:51.635308   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.638002   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.638369   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.638395   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.638622   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.638815   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.638982   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.639145   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:51.720405   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:49:51.720478   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:49:51.746352   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:49:51.746412   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:49:51.770628   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:49:51.770702   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:49:51.795258   39794 provision.go:87] duration metric: took 340.256082ms to configureAuth
	I0717 17:49:51.795284   39794 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:49:51.795490   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:51.795501   39794 machine.go:97] duration metric: took 685.467301ms to provisionDockerMachine
	I0717 17:49:51.795514   39794 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:49:51.795528   39794 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:49:51.795563   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.795850   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:49:51.795874   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.798310   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.798696   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.798719   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.798889   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.799047   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.799191   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.799286   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:51.881403   39794 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:49:51.885516   39794 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:49:51.885542   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:49:51.885603   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:49:51.885687   39794 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:49:51.885697   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:49:51.885773   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:49:51.894953   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:51.919442   39794 start.go:296] duration metric: took 123.913575ms for postStartSetup
	I0717 17:49:51.919487   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.919775   39794 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:49:51.919801   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.922159   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.922506   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.922533   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.922672   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.922878   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.923036   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.923152   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:52.004408   39794 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:49:52.004481   39794 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:49:52.063014   39794 fix.go:56] duration metric: took 18.171175537s for fixHost
	I0717 17:49:52.063061   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:52.065858   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.066239   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.066269   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.066459   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.066648   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.066806   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.066931   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.067086   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:52.067288   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:52.067303   39794 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:49:52.166802   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238592.140235525
	
	I0717 17:49:52.166826   39794 fix.go:216] guest clock: 1721238592.140235525
	I0717 17:49:52.166835   39794 fix.go:229] Guest: 2024-07-17 17:49:52.140235525 +0000 UTC Remote: 2024-07-17 17:49:52.063042834 +0000 UTC m=+40.822975139 (delta=77.192691ms)
	I0717 17:49:52.166849   39794 fix.go:200] guest clock delta is within tolerance: 77.192691ms
	I0717 17:49:52.166853   39794 start.go:83] releasing machines lock for "ha-333994-m02", held for 18.275039229s
	I0717 17:49:52.166873   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.167105   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:52.169592   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.169924   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.169948   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.172181   39794 out.go:177] * Found network options:
	I0717 17:49:52.173607   39794 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:49:52.174972   39794 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:49:52.175003   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175597   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175781   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175858   39794 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:49:52.175897   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:49:52.175951   39794 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:49:52.176007   39794 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:49:52.176024   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:52.178643   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.178748   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179072   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.179098   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179230   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.179248   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.179272   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179432   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.179524   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.179596   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.179664   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.179721   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.179794   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:52.179844   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:49:52.256371   39794 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:49:52.256433   39794 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:49:52.287825   39794 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:49:52.287848   39794 start.go:495] detecting cgroup driver to use...
	I0717 17:49:52.287901   39794 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:49:52.316497   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:49:52.330140   39794 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:49:52.330189   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:49:52.343721   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:49:52.357273   39794 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:49:52.483050   39794 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:49:52.682504   39794 docker.go:233] disabling docker service ...
	I0717 17:49:52.682571   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:49:52.702383   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:49:52.717022   39794 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:49:52.851857   39794 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:49:52.989407   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:49:53.003913   39794 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:49:53.024876   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:49:53.035470   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:49:53.046129   39794 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:49:53.046184   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:49:53.056553   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:53.067211   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:49:53.077626   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:53.088680   39794 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:49:53.100371   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:49:53.111920   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:49:53.123072   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:49:53.133713   39794 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:49:53.143333   39794 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:49:53.143405   39794 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:49:53.157890   39794 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:49:53.167934   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:53.302893   39794 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:49:53.333425   39794 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:49:53.333488   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:53.339060   39794 retry.go:31] will retry after 1.096332725s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:49:54.435963   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:54.441531   39794 start.go:563] Will wait 60s for crictl version
	I0717 17:49:54.441599   39794 ssh_runner.go:195] Run: which crictl
	I0717 17:49:54.445786   39794 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:49:54.483822   39794 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:49:54.483877   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:54.518845   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:54.553079   39794 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:49:54.554649   39794 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:49:54.556061   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:54.559046   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:54.559422   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:54.559444   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:54.559695   39794 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:49:54.564470   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:54.579269   39794 mustload.go:65] Loading cluster: ha-333994
	I0717 17:49:54.579483   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:54.579765   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:54.579792   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:54.594439   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39001
	I0717 17:49:54.594883   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:54.595350   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:54.595374   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:54.595675   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:54.595858   39794 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:49:54.597564   39794 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:49:54.597896   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:54.597921   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:54.613634   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34405
	I0717 17:49:54.614031   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:54.614493   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:54.614511   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:54.614816   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:54.615002   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:54.615153   39794 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:49:54.615165   39794 certs.go:194] generating shared ca certs ...
	I0717 17:49:54.615183   39794 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:54.615314   39794 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:49:54.615354   39794 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:49:54.615363   39794 certs.go:256] generating profile certs ...
	I0717 17:49:54.615452   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:49:54.615493   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:49:54.615524   39794 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:49:54.615535   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:49:54.615548   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:49:54.615560   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:49:54.615575   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:49:54.615587   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:49:54.615599   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:49:54.615635   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:49:54.615651   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:49:54.615692   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:49:54.615716   39794 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:49:54.615731   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:49:54.615754   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:49:54.615774   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:49:54.615795   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:49:54.615829   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:54.615854   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:49:54.615866   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:49:54.615877   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:54.615902   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:54.618791   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:54.619169   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:54.619191   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:54.619351   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:54.619524   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:54.619660   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:54.619789   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:54.694549   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:49:54.699693   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:49:54.711136   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:49:54.715759   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:49:54.727707   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:49:54.732038   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:49:54.743206   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:49:54.747536   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:49:54.759182   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:49:54.763279   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:49:54.774195   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:49:54.778345   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:49:54.790000   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:49:54.817482   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:49:54.842528   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:49:54.867521   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:49:54.893528   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0717 17:49:54.920674   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:49:54.946673   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:49:54.972385   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:49:54.997675   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:49:55.023298   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:49:55.048552   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:49:55.073345   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:49:55.091193   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:49:55.108383   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:49:55.125529   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:49:55.142804   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:49:55.160482   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:49:55.178995   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:49:55.197026   39794 ssh_runner.go:195] Run: openssl version
	I0717 17:49:55.202998   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:49:55.214662   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.219373   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.219447   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.225441   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:49:55.236543   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:49:55.247672   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.252336   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.252396   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.258207   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:49:55.269215   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:49:55.280136   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.284763   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.284843   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.290471   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:49:55.301174   39794 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:49:55.305201   39794 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:49:55.305253   39794 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:49:55.305343   39794 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:49:55.305377   39794 kube-vip.go:115] generating kube-vip config ...
	I0717 17:49:55.305412   39794 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:49:55.322820   39794 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:49:55.322885   39794 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:49:55.322938   39794 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:49:55.332945   39794 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:49:55.333009   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0717 17:49:55.342555   39794 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
	I0717 17:49:55.358883   39794 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:49:55.375071   39794 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:49:55.393413   39794 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:49:55.397331   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:55.411805   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:55.535806   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:55.554620   39794 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:49:55.554913   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:55.556751   39794 out.go:177] * Verifying Kubernetes components...
	I0717 17:49:55.558066   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:55.748334   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:56.613699   39794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:56.613920   39794 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0717 17:49:56.613970   39794 kubeadm.go:483] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.180:8443
	I0717 17:49:56.614170   39794 node_ready.go:35] waiting up to 6m0s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:49:56.614265   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:56.614272   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:56.614280   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:56.614286   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:56.627325   39794 round_trippers.go:574] Response Status: 404 Not Found in 13 milliseconds
	I0717 17:49:57.115057   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:57.115083   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:57.115091   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:57.115095   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:57.117582   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:57.614333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:57.614354   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:57.614362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:57.614365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:57.616581   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:58.115352   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:58.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:58.115362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:58.117636   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.615397   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:58.615423   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:58.615434   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:58.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:58.617780   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.617919   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:49:59.114753   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:59.114774   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:59.114782   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:59.114787   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:59.116989   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:59.615261   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:59.615289   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:59.615299   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:59.615305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:59.617539   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:00.115327   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:00.115348   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:00.115356   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:00.115359   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:00.117595   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:00.615335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:00.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:00.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:00.615371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:00.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:01.115332   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:01.115352   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:01.115360   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:01.115364   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:01.118462   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:01.118555   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:01.614396   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:01.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:01.614425   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:01.614429   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:01.616688   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:02.115381   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:02.115413   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:02.115424   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:02.115429   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:02.117845   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:02.614519   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:02.614541   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:02.614549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:02.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:02.616973   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.114666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:03.114690   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:03.114706   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:03.114711   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:03.116808   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.614478   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:03.614500   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:03.614508   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:03.614512   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:03.616763   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.616861   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:04.115079   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:04.115103   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:04.115110   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:04.115116   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:04.117400   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:04.614899   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:04.614922   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:04.614932   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:04.614936   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:04.617138   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.115001   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:05.115024   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:05.115031   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:05.115039   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:05.117375   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.615121   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:05.615145   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:05.615153   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:05.615158   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:05.617472   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.617581   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:06.115206   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:06.115226   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:06.115235   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:06.115240   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:06.117694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:06.614430   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:06.614453   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:06.614462   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:06.614467   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:06.616849   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:07.115357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:07.115378   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:07.115386   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:07.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:07.117909   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:07.614460   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:07.614484   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:07.614492   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:07.614497   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:07.617064   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:08.115383   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:08.115405   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:08.115412   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:08.115417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:08.117848   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:08.117947   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:08.614392   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:08.614415   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:08.614423   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:08.614427   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:08.616608   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:09.114929   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:09.114950   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:09.114958   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:09.114962   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:09.117409   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:09.614639   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:09.614659   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:09.614666   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:09.614670   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:09.616904   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.114644   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:10.114668   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:10.114676   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:10.114685   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:10.117224   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.614973   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:10.614995   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:10.615003   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:10.615007   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:10.617362   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.617474   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:11.115160   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:11.115187   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:11.115197   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:11.115202   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:11.117916   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:11.615031   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:11.615053   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:11.615061   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:11.615065   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:11.617581   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.115275   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:12.115297   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:12.115305   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:12.115308   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:12.117625   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.615329   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:12.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:12.615364   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:12.615367   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:12.617808   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.617929   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:13.114465   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:13.114488   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:13.114497   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:13.114501   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:13.116973   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:13.614674   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:13.614704   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:13.614715   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:13.614721   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:13.617161   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:14.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:14.115351   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:14.115357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:14.115360   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:14.117798   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:14.615028   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:14.615052   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:14.615062   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:14.615068   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:14.617174   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:15.115117   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:15.115140   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:15.115149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:15.115154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:15.117832   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:15.117958   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:15.614474   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:15.614517   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:15.614528   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:15.614534   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:15.616851   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:16.114493   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:16.114529   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:16.114536   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:16.114540   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:16.117140   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:16.614895   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:16.614922   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:16.614935   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:16.614943   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:16.617847   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.114480   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:17.114500   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:17.114507   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:17.114510   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:17.116841   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.614484   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:17.614505   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:17.614512   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:17.614515   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:17.616877   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.617049   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:18.115326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:18.115346   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:18.115354   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:18.115358   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:18.117690   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:18.614346   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:18.614364   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:18.614372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:18.614377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:18.617203   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:19.114315   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:19.114349   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:19.114357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:19.114362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:19.119328   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:50:19.614516   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:19.614536   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:19.614544   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:19.614549   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:19.616974   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:19.617173   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:20.114872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:20.114896   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:20.114905   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:20.114908   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:20.117228   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:20.614953   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:20.614974   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:20.614981   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:20.614987   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:20.617553   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:21.115256   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:21.115288   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:21.115297   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:21.115305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:21.117516   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:21.614470   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:21.614493   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:21.614504   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:21.614512   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:21.616801   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:22.114458   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:22.114481   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:22.114491   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:22.114497   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:22.116704   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:22.116814   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:22.614361   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:22.614383   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:22.614391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:22.614395   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:22.616868   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:23.115322   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:23.115343   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:23.115351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:23.115356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:23.117765   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:23.614438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:23.614469   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:23.614480   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:23.614486   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:23.616902   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:24.115254   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:24.115277   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:24.115287   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:24.115292   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:24.117319   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:24.117422   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:24.614618   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:24.614655   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:24.614665   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:24.614669   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:24.617182   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:25.115401   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:25.115422   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:25.115430   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:25.115434   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:25.118835   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:25.614325   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:25.614351   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:25.614361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:25.614366   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:25.616764   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.114413   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:26.114451   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:26.114460   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:26.114464   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:26.117000   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.614789   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:26.614815   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:26.614826   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:26.614831   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:26.617192   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.617279   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:27.114863   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:27.114888   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:27.114897   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:27.114903   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:27.117792   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:27.615352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:27.615378   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:27.615389   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:27.615394   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:27.618057   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:28.115330   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:28.115353   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:28.115362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:28.115365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:28.117820   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:28.615355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:28.615377   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:28.615385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:28.615389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:28.619637   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:50:28.619765   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:29.114706   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:29.114727   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:29.114734   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:29.114738   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:29.117064   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:29.614803   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:29.614826   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:29.614835   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:29.614839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:29.617436   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:30.114527   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:30.114550   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:30.114560   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:30.114565   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:30.116856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:30.614518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:30.614542   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:30.614551   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:30.614554   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:30.617139   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:31.114819   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:31.114856   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:31.114867   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:31.114873   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:31.117237   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:31.117345   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:31.615179   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:31.615203   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:31.615219   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:31.615224   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:31.617525   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:32.115306   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:32.115329   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:32.115337   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:32.115341   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:32.117639   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:32.614367   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:32.614391   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:32.614399   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:32.614403   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:32.617172   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:33.115127   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:33.115150   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:33.115158   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:33.115162   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:33.117796   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:33.117911   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:33.614544   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:33.614586   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:33.614597   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:33.614611   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:33.616706   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:34.115175   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:34.115197   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:34.115207   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:34.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:34.117345   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:34.614352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:34.614373   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:34.614380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:34.614384   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:34.616826   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.114840   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:35.114867   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:35.114876   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:35.114881   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:35.117298   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.615114   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:35.615140   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:35.615149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:35.615154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:35.617788   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.617897   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:36.115372   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:36.115393   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:36.115402   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:36.115405   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:36.117735   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:36.615357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:36.615376   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:36.615383   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:36.615388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:36.617891   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:37.114533   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:37.114559   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:37.114567   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:37.114572   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:37.117729   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:37.615384   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:37.615406   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:37.615414   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:37.615417   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:37.617760   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:38.114425   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:38.114448   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:38.114455   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:38.114458   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:38.117016   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:38.117135   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:38.614755   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:38.614779   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:38.614787   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:38.614790   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:38.617099   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:39.115282   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:39.115303   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:39.115311   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:39.115315   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:39.117895   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:39.614832   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:39.614853   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:39.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:39.614865   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:39.617355   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:40.115339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:40.115361   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:40.115369   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:40.115373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:40.117661   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:40.117759   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:40.614368   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:40.614389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:40.614396   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:40.614399   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:40.616881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:41.114581   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:41.114606   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:41.114616   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:41.114622   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:41.116877   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:41.614884   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:41.614906   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:41.614914   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:41.614919   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:41.617263   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:42.115156   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:42.115181   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:42.115193   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:42.115201   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:42.117713   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:42.117819   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:42.614328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:42.614348   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:42.614356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:42.614361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:42.617382   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:43.115106   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:43.115127   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:43.115135   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:43.115140   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:43.117355   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:43.615121   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:43.615142   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:43.615149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:43.615154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:43.617549   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.114805   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:44.114826   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:44.114834   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:44.114839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:44.117204   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.615412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:44.615431   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:44.615439   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:44.615444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:44.617856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.617969   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:45.115069   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:45.115093   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:45.115105   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:45.115110   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:45.117220   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:45.614988   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:45.615008   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:45.615015   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:45.615018   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:45.617421   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:46.115156   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:46.115178   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:46.115199   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:46.115203   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:46.117371   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:46.615053   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:46.615076   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:46.615086   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:46.615094   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:46.617407   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:47.115173   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:47.115195   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:47.115203   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:47.115207   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:47.117871   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:47.117975   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:47.614555   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:47.614577   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:47.614586   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:47.614589   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:47.617103   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:48.114743   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:48.114770   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:48.114782   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:48.114787   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:48.116997   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:48.614683   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:48.614710   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:48.614721   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:48.614734   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:48.617185   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.115307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:49.115332   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:49.115343   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:49.115347   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:49.117646   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.614838   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:49.614858   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:49.614872   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:49.614880   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:49.617342   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.617440   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:50.115333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:50.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:50.115372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:50.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:50.117536   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:50.615245   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:50.615270   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:50.615278   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:50.615282   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:50.617747   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:51.114366   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:51.114389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:51.114396   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:51.114400   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:51.116597   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:51.614367   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:51.614389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:51.614397   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:51.614401   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:51.616747   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:52.114431   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:52.114453   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:52.114461   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:52.114464   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:52.117371   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:52.117470   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:52.615088   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:52.615111   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:52.615118   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:52.615122   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:52.617416   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:53.115173   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:53.115195   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:53.115203   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:53.115208   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:53.117683   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:53.614356   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:53.614376   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:53.614384   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:53.614388   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:53.616703   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:54.114990   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:54.115013   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:54.115020   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:54.115024   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:54.117855   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:54.117941   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:54.615104   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:54.615125   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:54.615135   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:54.615140   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:54.617869   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:55.114983   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:55.115005   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:55.115012   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:55.115016   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:55.117396   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:55.615131   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:55.615152   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:55.615168   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:55.615171   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:55.617453   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.115180   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:56.115201   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:56.115209   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:56.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:56.117326   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.615051   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:56.615074   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:56.615082   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:56.615087   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:56.617369   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.617480   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:57.115080   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:57.115102   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:57.115110   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:57.115114   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:57.117510   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:57.615224   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:57.615246   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:57.615254   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:57.615258   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:57.617511   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:58.114791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:58.114811   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:58.114819   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:58.114823   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:58.117265   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:58.614995   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:58.615015   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:58.615023   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:58.615028   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:58.617145   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:59.115321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:59.115342   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:59.115350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:59.115353   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:59.117772   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:59.117893   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:59.614872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:59.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:59.614903   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:59.614906   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:59.617194   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:00.115270   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:00.115293   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:00.115301   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:00.115304   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:00.117653   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:00.615357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:00.615379   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:00.615387   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:00.615391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:00.617720   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.114385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:01.114407   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:01.114413   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:01.114416   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:01.116717   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.614708   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:01.614735   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:01.614745   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:01.614751   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:01.617211   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.617309   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:02.114916   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:02.114948   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:02.114956   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:02.114965   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:02.117244   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:02.614964   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:02.614987   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:02.614995   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:02.614999   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:02.617512   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:03.115219   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:03.115239   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:03.115247   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:03.115251   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:03.117907   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:03.614525   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:03.614547   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:03.614557   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:03.614561   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:03.621322   39794 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0717 17:51:03.621424   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:04.114491   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:04.114513   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:04.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:04.114527   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:04.116543   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:04.614666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:04.614688   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:04.614699   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:04.614705   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:04.616831   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:05.114969   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:05.114996   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:05.115003   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:05.115008   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:05.117465   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:05.615208   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:05.615231   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:05.615240   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:05.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:05.617689   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:06.114340   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:06.114360   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:06.114368   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:06.114372   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:06.116445   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:06.116590   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:06.615129   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:06.615154   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:06.615165   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:06.615172   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:06.617588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:07.115324   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:07.115349   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:07.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:07.115363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:07.117841   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:07.614530   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:07.614557   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:07.614566   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:07.614570   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:07.617073   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:08.114714   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:08.114739   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:08.114750   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:08.114756   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:08.117056   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:08.117161   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:08.615333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:08.615352   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:08.615360   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:08.615363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:08.617848   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:09.114938   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:09.114965   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:09.114974   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:09.114980   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:09.118060   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:09.615157   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:09.615177   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:09.615186   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:09.615192   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:09.617894   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:10.115084   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:10.115104   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:10.115112   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:10.115120   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:10.117391   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:10.117508   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:10.615120   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:10.615145   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:10.615155   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:10.615161   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:10.617842   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:11.114485   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:11.114507   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:11.114515   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:11.114520   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:11.117245   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:11.615400   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:11.615426   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:11.615437   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:11.615444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:11.617790   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:12.115351   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:12.115374   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:12.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:12.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:12.117803   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:12.117915   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:12.614461   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:12.614485   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:12.614495   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:12.614500   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:12.617208   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:13.114980   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:13.115005   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:13.115016   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:13.115020   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:13.117385   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:13.615122   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:13.615148   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:13.615160   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:13.615166   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:13.617805   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.115212   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:14.115244   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:14.115253   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:14.115258   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:14.117528   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.614681   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:14.614701   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:14.614711   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:14.614717   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:14.617113   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.617211   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:15.115267   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:15.115291   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:15.115302   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:15.115309   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:15.117537   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:15.615307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:15.615331   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:15.615340   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:15.615345   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:15.617660   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.115400   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:16.115426   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:16.115437   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:16.115444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:16.118040   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.614666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:16.614688   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:16.614698   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:16.614703   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:16.617162   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.617258   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:17.114829   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:17.114853   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:17.114863   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:17.114868   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:17.117547   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:17.615274   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:17.615295   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:17.615316   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:17.615323   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:17.617344   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.115064   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:18.115086   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:18.115097   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:18.115101   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:18.117232   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.614999   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:18.615021   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:18.615032   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:18.615037   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:18.617285   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.617392   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:19.114376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:19.114407   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:19.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:19.114451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:19.117257   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:19.615315   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:19.615335   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:19.615344   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:19.615350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:19.617155   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:20.115264   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:20.115284   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:20.115292   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:20.115296   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:20.117412   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:20.615133   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:20.615154   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:20.615162   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:20.615165   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:20.616967   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:21.114603   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:21.114639   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:21.114648   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:21.114655   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:21.116866   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:21.116957   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:21.614816   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:21.614841   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:21.614850   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:21.614854   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:21.617362   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:22.115139   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:22.115162   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:22.115170   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:22.115174   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:22.117729   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:22.614412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:22.614434   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:22.614440   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:22.614444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:22.617178   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.114352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:23.114377   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:23.114388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:23.114392   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:23.116563   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.615345   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:23.615372   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:23.615380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:23.615383   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:23.618002   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.618112   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:24.115378   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:24.115401   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:24.115411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:24.115418   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:24.117758   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:24.614891   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:24.614912   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:24.614922   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:24.614926   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:24.617332   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:25.115412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:25.115436   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:25.115445   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:25.115448   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:25.117910   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:25.614339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:25.614363   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:25.614371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:25.614375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:25.617451   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:26.115183   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:26.115207   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:26.115219   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:26.115225   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:26.117163   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:26.117274   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:26.614942   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:26.614966   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:26.614977   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:26.614984   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:26.617676   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:27.115347   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:27.115370   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:27.115380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:27.115385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:27.117861   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:27.615326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:27.615350   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:27.615359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:27.615363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:27.618250   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:28.114551   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:28.114569   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:28.114577   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:28.114583   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:28.117333   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:28.117440   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:28.615148   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:28.615180   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:28.615191   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:28.615196   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:28.617554   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:29.114764   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:29.114789   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:29.114800   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:29.114804   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:29.116808   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:29.615144   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:29.615168   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:29.615180   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:29.615195   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:29.617588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:30.114646   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:30.114670   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:30.114678   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:30.114681   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:30.116515   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:30.615245   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:30.615265   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:30.615273   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:30.615276   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:30.617998   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:30.618150   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:31.115373   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:31.115395   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:31.115403   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:31.115407   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:31.117657   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:31.614754   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:31.614781   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:31.614789   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:31.614793   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:31.616938   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:32.115334   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:32.115357   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:32.115370   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:32.115374   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:32.117890   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:32.614529   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:32.614551   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:32.614559   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:32.614563   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:32.617063   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:33.114739   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:33.114762   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:33.114769   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:33.114773   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:33.116876   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:33.116968   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:33.614545   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:33.614566   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:33.614574   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:33.614579   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:33.616992   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:34.115382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:34.115403   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:34.115411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:34.115414   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:34.117715   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:34.614863   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:34.614881   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:34.614888   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:34.614893   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:34.617243   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:35.115352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:35.115375   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:35.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:35.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:35.117853   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:35.117957   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:35.614511   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:35.614533   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:35.614541   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:35.614547   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:35.617000   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:36.114661   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:36.114682   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:36.114690   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:36.114695   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:36.117055   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:36.614872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:36.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:36.614903   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:36.614908   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:36.617081   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.114747   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:37.114772   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:37.114783   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:37.114788   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:37.117323   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.615053   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:37.615075   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:37.615086   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:37.615094   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:37.617571   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.617677   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:38.115271   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:38.115293   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:38.115301   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:38.115305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:38.117337   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:38.615114   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:38.615136   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:38.615143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:38.615146   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:38.617524   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:39.114693   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:39.114717   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:39.114726   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:39.114731   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:39.116906   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:39.615059   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:39.615078   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:39.615086   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:39.615090   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:39.617554   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:40.114620   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:40.114645   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:40.114655   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:40.114659   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:40.116637   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:40.116742   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:40.615346   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:40.615368   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:40.615379   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:40.615385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:40.617774   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:41.114442   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:41.114463   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:41.114470   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:41.114474   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:41.116924   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:41.614862   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:41.614882   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:41.614890   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:41.614893   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:41.617121   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:42.114844   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:42.114871   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:42.114880   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:42.114887   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:42.117456   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:42.117549   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:42.615184   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:42.615219   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:42.615228   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:42.615231   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:42.617697   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:43.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:43.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:43.115374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:43.115377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:43.117888   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:43.614542   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:43.614564   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:43.614572   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:43.614575   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:43.617156   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.114390   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:44.114418   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:44.114430   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:44.114435   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:44.116806   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.614781   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:44.614799   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:44.614808   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:44.614813   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:44.616969   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.617103   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:45.115008   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:45.115031   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:45.115040   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:45.115044   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:45.117431   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:45.615224   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:45.615252   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:45.615262   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:45.615266   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:45.617533   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.115209   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:46.115230   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:46.115238   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:46.115243   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:46.118193   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.614898   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:46.614921   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:46.614928   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:46.614932   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:46.617234   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.617429   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:47.115009   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:47.115032   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:47.115040   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:47.115044   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:47.117484   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:47.615213   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:47.615236   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:47.615245   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:47.615249   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:47.617602   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:48.115343   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:48.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:48.115371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:48.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:48.117939   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:48.614599   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:48.614625   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:48.614634   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:48.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:48.617112   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:49.115322   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:49.115343   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:49.115351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:49.115356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:49.117738   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:49.117854   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:49.614434   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:49.614465   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:49.614475   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:49.614479   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:49.617641   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:50.115335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:50.115358   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:50.115366   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:50.115370   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:50.117407   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:50.615319   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:50.615340   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:50.615348   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:50.615353   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:50.617718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.114376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:51.114398   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:51.114407   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:51.114414   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:51.116810   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.614799   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:51.614831   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:51.614839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:51.614844   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:51.617260   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.617398   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:52.115069   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:52.115094   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:52.115102   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:52.115108   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:52.117538   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:52.615321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:52.615352   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:52.615361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:52.615365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:52.617834   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:53.114486   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:53.114512   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:53.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:53.114527   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:53.118242   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:53.615003   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:53.615034   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:53.615045   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:53.615051   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:53.617718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:53.617826   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:54.115063   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:54.115091   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:54.115100   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:54.115105   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:54.117425   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:54.615271   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:54.615295   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:54.615304   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:54.615309   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:54.617987   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:55.115096   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:55.115119   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:55.115127   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:55.115131   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:55.117113   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:55.614857   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:55.614881   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:55.614897   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:55.614903   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:55.617711   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:56.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:56.115349   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:56.115357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:56.115361   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:56.118008   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:56.118139   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:56.614719   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:56.614745   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:56.614752   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:56.614756   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:56.617529   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:57.115288   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:57.115310   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:57.115318   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:57.115321   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:57.117714   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:57.614495   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:57.614517   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:57.614525   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:57.614528   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:57.616925   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.114573   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:58.114598   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:58.114609   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:58.114613   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:58.116783   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.614438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:58.614459   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:58.614469   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:58.614476   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:58.616851   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.616956   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:59.115030   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:59.115055   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:59.115066   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:59.115073   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:59.117347   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:59.615128   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:59.615151   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:59.615159   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:59.615164   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:59.617627   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.114672   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:00.114694   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:00.114702   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:00.114706   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:00.117073   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.614975   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:00.614999   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:00.615009   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:00.615014   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:00.617143   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.617251   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:01.114805   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:01.114842   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:01.114852   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:01.114858   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:01.117434   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:01.614407   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:01.614440   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:01.614448   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:01.614453   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:01.617018   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:02.114693   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:02.114715   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:02.114722   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:02.114727   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:02.116963   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:02.614625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:02.614650   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:02.614660   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:02.614664   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:02.617042   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:03.114744   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:03.114767   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:03.114775   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:03.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:03.116932   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:03.117041   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:03.614597   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:03.614618   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:03.614626   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:03.614630   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:03.616748   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:04.115018   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:04.115039   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:04.115049   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:04.115053   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:04.117556   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:04.615321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:04.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:04.615361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:04.615368   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:04.617694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:05.114830   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:05.114857   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:05.114865   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:05.114869   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:05.117278   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:05.117380   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:05.615000   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:05.615035   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:05.615046   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:05.615052   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:05.617339   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:06.115037   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:06.115056   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:06.115062   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:06.115066   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:06.117588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:06.614309   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:06.614333   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:06.614341   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:06.614346   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:06.616516   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:07.115312   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:07.115336   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:07.115345   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:07.115349   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:07.117526   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:07.117714   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:07.615376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:07.615398   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:07.615406   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:07.615410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:07.617826   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:08.114477   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:08.114499   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:08.114507   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:08.114511   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:08.116889   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:08.614611   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:08.614639   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:08.614649   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:08.614658   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:08.616943   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.115169   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:09.115191   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:09.115199   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:09.115202   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:09.117574   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.615328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:09.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:09.615357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:09.615361   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:09.617889   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.618007   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:10.115232   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:10.115254   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:10.115262   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:10.115268   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:10.117721   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:10.614358   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:10.614381   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:10.614388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:10.614391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:10.616539   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:11.115338   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:11.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:11.115377   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:11.115384   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:11.117600   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:11.614501   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:11.614525   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:11.614535   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:11.614539   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:11.616883   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:12.114522   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:12.114544   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:12.114552   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:12.114557   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:12.117075   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:12.117189   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:12.614823   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:12.614850   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:12.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:12.614866   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:12.617132   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:13.114797   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:13.114818   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:13.114830   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:13.114835   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:13.117193   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:13.614859   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:13.614880   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:13.614887   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:13.614891   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:13.617224   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:14.114680   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:14.114701   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:14.114708   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:14.114713   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:14.117640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:14.117759   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:14.615371   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:14.615399   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:14.615408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:14.615412   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:14.617899   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:15.115288   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:15.115307   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:15.115316   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:15.115320   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:15.117625   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:15.615379   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:15.615399   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:15.615407   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:15.615410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:15.617678   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.115335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:16.115358   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:16.115368   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:16.115373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:16.117508   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.615332   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:16.615355   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:16.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:16.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:16.617762   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.617852   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:17.115342   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:17.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:17.115374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:17.115380   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:17.117745   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:17.614381   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:17.614404   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:17.614411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:17.614414   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:17.616676   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:18.114344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:18.114365   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:18.114372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:18.114377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:18.116126   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:18.614823   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:18.614850   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:18.614859   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:18.614863   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:18.617249   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:19.114382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:19.114404   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:19.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:19.114422   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:19.116549   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:19.116667   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:19.615132   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:19.615157   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:19.615166   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:19.615171   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:19.617897   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:20.115394   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:20.115422   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:20.115433   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:20.115438   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:20.120626   39794 round_trippers.go:574] Response Status: 404 Not Found in 5 milliseconds
	I0717 17:52:20.615314   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:20.615335   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:20.615343   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:20.615348   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:20.617815   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:21.114476   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:21.114497   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:21.114509   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:21.114516   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:21.116694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:21.116789   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:21.614568   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:21.614590   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:21.614596   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:21.614600   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:21.616740   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:22.114442   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:22.114465   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:22.114472   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:22.114477   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:22.116620   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:22.615373   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:22.615414   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:22.615422   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:22.615425   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:22.617822   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:23.115355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:23.115377   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:23.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:23.115390   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:23.117793   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:23.117961   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:23.614462   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:23.614484   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:23.614492   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:23.614495   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:23.616758   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:24.115153   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:24.115174   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:24.115183   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:24.115187   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:24.117485   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:24.615251   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:24.615278   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:24.615289   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:24.615294   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:24.618155   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.114625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:25.114648   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:25.114656   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:25.114660   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:25.117162   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.614843   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:25.614863   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:25.614871   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:25.614875   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:25.616943   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.617057   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:26.114625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:26.114665   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:26.114677   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:26.114681   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:26.116743   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:26.614490   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:26.614512   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:26.614521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:26.614524   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:26.616812   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:27.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:27.115366   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:27.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:27.115379   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:27.117751   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:27.614385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:27.614429   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:27.614436   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:27.614440   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:27.616766   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:28.114438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:28.114463   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:28.114472   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:28.114476   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:28.116881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:28.116995   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:28.614550   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:28.614573   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:28.614583   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:28.614589   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:28.616576   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:29.114665   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:29.114688   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:29.114697   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:29.114701   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:29.116949   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:29.614618   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:29.614639   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:29.614647   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:29.614652   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:29.617229   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:30.114692   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:30.114711   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:30.114718   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:30.114725   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:30.116453   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:30.615200   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:30.615233   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:30.615241   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:30.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:30.617947   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:30.618078   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:31.114620   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:31.114663   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:31.114674   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:31.114677   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:31.116821   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:31.614807   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:31.614849   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:31.614857   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:31.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:31.617107   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:32.114733   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:32.114772   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:32.114780   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:32.114784   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:32.117117   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:32.614873   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:32.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:32.614906   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:32.614913   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:32.617084   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:33.114744   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:33.114767   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:33.114774   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:33.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:33.116968   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:33.117056   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:33.614614   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:33.614634   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:33.614642   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:33.614648   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:33.616694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:34.114989   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:34.115010   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:34.115019   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:34.115023   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:34.117256   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:34.615017   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:34.615039   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:34.615046   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:34.615049   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:34.617305   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.114707   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:35.114729   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:35.114737   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:35.114741   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:35.116837   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.614518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:35.614541   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:35.614549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:35.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:35.617169   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.617264   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:36.114880   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:36.114903   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:36.114912   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:36.114915   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:36.117413   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:36.615154   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:36.615178   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:36.615186   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:36.615189   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:36.617681   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:37.114404   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:37.114427   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:37.114435   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:37.114438   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:37.116709   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:37.614419   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:37.614444   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:37.614452   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:37.614465   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:37.616814   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:38.114522   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:38.114550   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:38.114560   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:38.114566   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:38.117012   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:38.117111   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:38.614715   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:38.614738   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:38.614746   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:38.614750   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:38.617029   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:39.115300   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:39.115321   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:39.115330   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:39.115334   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:39.117647   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:39.615387   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:39.615412   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:39.615418   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:39.615422   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:39.617840   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.114520   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:40.114541   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:40.114548   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:40.114553   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:40.116874   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.614642   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:40.614667   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:40.614677   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:40.614682   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:40.617201   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.617299   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:41.114884   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:41.114913   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:41.114925   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:41.114930   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:41.117705   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:41.614760   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:41.614784   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:41.614793   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:41.614799   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:41.617304   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.115055   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:42.115077   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:42.115086   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:42.115092   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:42.117464   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.615207   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:42.615231   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:42.615242   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:42.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:42.617788   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.617906   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:43.114443   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:43.114471   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:43.114484   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:43.114489   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:43.116804   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:43.614503   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:43.614534   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:43.614546   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:43.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:43.616923   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:44.114333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:44.114362   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:44.114371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:44.114376   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:44.116593   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:44.615353   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:44.615375   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:44.615383   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:44.615387   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:44.619020   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:44.619252   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:45.114535   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:45.114558   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:45.114565   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:45.114568   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:45.116805   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:45.614455   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:45.614477   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:45.614485   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:45.614489   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:45.616531   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:46.115306   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:46.115327   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:46.115334   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:46.115340   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:46.117430   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:46.615326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:46.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:46.615358   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:46.615364   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:46.617638   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:47.115375   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:47.115397   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:47.115405   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:47.115410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:47.117966   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:47.118069   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:47.614605   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:47.614627   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:47.614635   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:47.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:47.617373   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:48.115142   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:48.115164   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:48.115173   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:48.115177   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:48.117353   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:48.615075   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:48.615097   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:48.615105   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:48.615109   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:48.617317   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.114470   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:49.114492   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:49.114501   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:49.114506   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:49.116813   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.615412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:49.615433   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:49.615441   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:49.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:49.617717   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.617816   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:50.115355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:50.115376   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:50.115384   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:50.115389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:50.117802   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:50.614440   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:50.614462   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:50.614469   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:50.614474   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:50.616542   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:51.115295   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:51.115318   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:51.115325   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:51.115329   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:51.118739   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:51.614657   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:51.614694   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:51.614703   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:51.614708   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:51.616892   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:52.114541   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:52.114568   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:52.114575   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:52.114578   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:52.117054   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:52.117156   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:52.614718   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:52.614748   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:52.614759   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:52.614765   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:52.617263   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:53.114959   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:53.114984   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:53.114996   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:53.115000   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:53.117274   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:53.615035   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:53.615060   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:53.615070   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:53.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:53.617250   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.114646   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:54.114679   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:54.114686   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:54.114694   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:54.116952   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.614585   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:54.614604   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:54.614612   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:54.614615   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:54.616959   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.617087   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:55.114518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:55.114543   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:55.114550   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:55.114556   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:55.117176   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:55.614804   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:55.614830   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:55.614843   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:55.614848   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:55.617029   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.114710   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:56.114739   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:56.114750   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:56.114757   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:56.117352   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.615042   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:56.615064   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:56.615072   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:56.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:56.617503   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.617629   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:57.115247   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:57.115273   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:57.115283   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:57.115289   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:57.119157   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:57.614778   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:57.614799   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:57.614808   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:57.614812   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:57.617771   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:58.114423   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:58.114444   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:58.114451   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:58.114455   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:58.116940   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:58.614594   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:58.614616   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:58.614626   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:58.614631   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:58.616901   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:59.114914   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:59.114934   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:59.114942   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:59.114945   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:59.117144   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:59.117235   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:59.614791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:59.614814   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:59.614822   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:59.614827   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:59.617115   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:00.115321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:00.115354   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:00.115362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:00.115366   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:00.117649   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:00.615378   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:00.615400   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:00.615411   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:00.615416   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:00.617719   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:01.114375   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:01.114397   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:01.114404   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:01.114408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:01.116718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:01.614966   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:01.614991   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:01.615002   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:01.615011   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:01.618973   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:01.619078   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:02.114685   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:02.114710   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:02.114718   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:02.114723   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:02.117526   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:02.615258   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:02.615281   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:02.615289   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:02.615293   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:02.617822   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:03.115326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:03.115355   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:03.115366   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:03.115371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:03.117667   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:03.615340   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:03.615365   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:03.615374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:03.615379   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:03.617818   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:04.115204   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:04.115226   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:04.115234   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:04.115238   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:04.117764   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:04.117866   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:04.615339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:04.615357   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:04.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:04.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:04.617952   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:05.114451   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:05.114472   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:05.114480   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:05.114484   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:05.116809   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:05.614454   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:05.614475   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:05.614482   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:05.614487   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:05.616856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.114518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:06.114541   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:06.114549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:06.114553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:06.117433   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.615116   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:06.615137   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:06.615145   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:06.615149   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:06.617328   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.617423   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:07.115073   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:07.115096   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:07.115105   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:07.115109   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:07.117243   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:07.614957   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:07.614980   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:07.614988   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:07.614992   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:07.617455   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:08.115203   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:08.115228   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:08.115237   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:08.115242   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:08.117953   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:08.614601   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:08.614621   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:08.614627   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:08.614632   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:08.616977   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:09.115171   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:09.115192   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:09.115200   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:09.115204   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:09.117505   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:09.117620   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:09.615237   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:09.615259   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:09.615266   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:09.615270   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:09.617567   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:10.115157   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:10.115180   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:10.115188   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:10.115191   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:10.117490   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:10.615247   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:10.615268   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:10.615277   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:10.615280   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:10.618489   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:11.115353   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:11.115374   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:11.115382   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:11.115385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:11.118557   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:11.118654   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:11.614419   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:11.614439   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:11.614447   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:11.614451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:11.616736   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:12.114441   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:12.114467   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:12.114475   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:12.114479   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:12.117113   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:12.615359   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:12.615379   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:12.615387   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:12.615390   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:12.617471   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.115196   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:13.115221   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:13.115230   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:13.115235   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:13.117548   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.615239   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:13.615269   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:13.615279   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:13.615285   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:13.617765   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.617868   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:14.115201   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:14.115222   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:14.115230   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:14.115238   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:14.118205   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:14.614910   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:14.614930   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:14.614941   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:14.614946   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:14.617345   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.114915   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:15.114940   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:15.114953   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:15.114959   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:15.117285   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.615063   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:15.615091   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:15.615102   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:15.615109   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:15.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.617892   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:16.114326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:16.114345   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:16.114353   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:16.114358   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:16.116687   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:16.614425   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:16.614445   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:16.614456   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:16.614463   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:16.616497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:17.115235   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:17.115266   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:17.115275   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:17.115281   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:17.117592   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:17.615370   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:17.615394   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:17.615403   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:17.615408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:17.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:18.115421   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:18.115449   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:18.115460   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:18.115466   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:18.117540   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:18.117666   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:18.615244   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:18.615268   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:18.615280   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:18.615285   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:18.617069   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:53:19.115249   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:19.115272   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:19.115282   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:19.115288   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:19.117713   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:19.614391   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:19.614427   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:19.614435   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:19.614439   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:19.616687   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:20.115222   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:20.115243   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:20.115251   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:20.115255   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:20.119958   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:53:20.120050   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:20.614619   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:20.614641   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:20.614651   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:20.614658   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:20.617751   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:21.115329   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:21.115350   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:21.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:21.115362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:21.118322   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:21.615343   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:21.615364   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:21.615373   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:21.615376   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:21.617662   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.114307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:22.114356   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:22.114367   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:22.114373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:22.116718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.614407   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:22.614436   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:22.614447   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:22.614452   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:22.616582   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.616699   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:23.115301   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:23.115323   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:23.115331   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:23.115335   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:23.117744   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:23.614413   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:23.614437   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:23.614447   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:23.614453   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:23.616559   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.115103   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:24.115133   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:24.115143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:24.115147   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:24.117693   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.614545   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:24.614569   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:24.614577   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:24.614581   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:24.617065   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.617179   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:25.114461   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:25.114485   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:25.114493   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:25.114496   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:25.116786   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:25.614416   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:25.614438   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:25.614446   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:25.614451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:25.616751   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.114388   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:26.114410   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:26.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:26.114421   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:26.116745   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.614603   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:26.614626   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:26.614634   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:26.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:26.617102   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.617207   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:27.114783   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:27.114807   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:27.114818   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:27.114826   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:27.117702   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:27.614374   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:27.614412   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:27.614420   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:27.614425   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:27.616497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.115222   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:28.115243   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:28.115250   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:28.115254   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:28.117523   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.615319   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:28.615342   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:28.615350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:28.615354   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:28.617775   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.617869   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:29.114872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:29.114893   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:29.114901   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:29.114907   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:29.116856   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:53:29.615278   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:29.615300   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:29.615308   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:29.615313   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:29.617690   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:30.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:30.115351   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:30.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:30.115363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:30.117881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:30.614571   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:30.614593   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:30.614601   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:30.614605   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:30.617497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:31.115219   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:31.115240   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:31.115247   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:31.115252   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:31.117580   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:31.117691   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:31.614491   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:31.614514   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:31.614520   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:31.614525   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:31.616752   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:32.114434   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:32.114457   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:32.114465   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:32.114469   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:32.116843   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:32.614510   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:32.614531   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:32.614537   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:32.614540   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:32.617151   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:33.114829   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:33.114852   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:33.114859   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:33.114863   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:33.117627   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:33.117746   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:33.615336   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:33.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:33.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:33.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:33.617473   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:34.114732   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:34.114770   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:34.114783   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:34.114788   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:34.117561   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:34.615316   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:34.615341   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:34.615351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:34.615356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:34.618153   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.114569   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:35.114593   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:35.114601   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:35.114605   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:35.116953   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.614348   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:35.614373   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:35.614383   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:35.614389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:35.617139   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.617237   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:36.114791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:36.114812   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:36.114819   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:36.114823   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:36.117593   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:36.615382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:36.615407   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:36.615417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:36.615426   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:36.618040   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:37.114722   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:37.114753   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:37.114761   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:37.114765   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:37.116947   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:37.614643   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:37.614686   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:37.614697   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:37.614702   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:37.616876   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:38.114536   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:38.114559   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:38.114566   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:38.114570   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:38.117369   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:38.117462   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:38.615126   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:38.615148   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:38.615156   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:38.615160   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:38.617869   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:39.115081   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:39.115113   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:39.115122   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:39.115126   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:39.117948   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:39.614619   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:39.614647   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:39.614659   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:39.614665   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:39.617484   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.115106   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:40.115131   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:40.115143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:40.115149   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:40.117287   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.615033   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:40.615059   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:40.615071   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:40.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:40.617572   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.617676   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:41.115286   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:41.115309   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:41.115316   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:41.115321   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:41.117762   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:41.614713   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:41.614734   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:41.614743   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:41.614747   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:41.617493   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:42.115269   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:42.115292   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:42.115303   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:42.115308   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:42.117720   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:42.614392   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:42.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:42.614427   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:42.614434   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:42.616931   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:43.115385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:43.115412   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:43.115425   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:43.115433   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:43.118066   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:43.118207   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:43.614713   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:43.614753   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:43.614765   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:43.614770   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:43.617067   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:44.114374   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:44.114406   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:44.114415   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:44.114419   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:44.116619   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:44.615405   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:44.615433   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:44.615441   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:44.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:44.617626   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.115126   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:45.115150   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:45.115158   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:45.115163   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:45.117350   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.615112   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:45.615135   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:45.615142   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:45.615147   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:45.617618   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.617714   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:46.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:46.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:46.115371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:46.115374   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:46.117523   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:46.615363   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:46.615386   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:46.615394   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:46.615398   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:46.617675   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.114336   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:47.114357   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:47.114365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:47.114369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:47.116450   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.615209   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:47.615232   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:47.615242   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:47.615248   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:47.617669   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.617889   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:48.114456   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:48.114479   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:48.114488   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:48.114491   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:48.116715   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:48.614390   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:48.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:48.614424   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:48.614427   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:48.616735   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:49.114828   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:49.114850   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:49.114858   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:49.114863   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:49.117111   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:49.614976   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:49.614997   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:49.615005   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:49.615010   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:49.617505   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:50.115004   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:50.115026   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:50.115033   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:50.115038   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:50.117347   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:50.117441   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:50.615143   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:50.615170   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:50.615179   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:50.615187   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:50.617427   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:51.115170   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:51.115193   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:51.115205   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:51.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:51.117289   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:51.615380   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:51.615407   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:51.615419   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:51.615426   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:51.618038   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.114724   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:52.114760   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:52.114773   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:52.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:52.117189   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.614887   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:52.614911   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:52.614922   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:52.614927   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:52.617222   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.617335   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:53.114967   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:53.114994   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:53.115005   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:53.115013   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:53.117578   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:53.614368   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:53.614394   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:53.614404   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:53.614412   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:53.617467   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:54.114883   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:54.114906   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:54.114915   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:54.114921   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:54.117603   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:54.615330   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:54.615353   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:54.615364   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:54.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:54.618101   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:54.618221   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:55.114614   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:55.114640   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:55.114649   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:55.114656   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:55.117436   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:55.615236   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:55.615260   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:55.615270   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:55.615276   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:55.617974   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:56.114490   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:56.114511   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:56.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:56.114524   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:56.117090   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:56.614907   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:56.614932   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:56.614943   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:56.614948   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:56.618676   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:56.618791   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:56.618808   39794 node_ready.go:38] duration metric: took 4m0.004607374s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:53:56.620932   39794 out.go:177] 
	W0717 17:53:56.622268   39794 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0717 17:53:56.622282   39794 out.go:239] * 
	W0717 17:53:56.623241   39794 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:53:56.625101   39794 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	4c2118d2ed18a       6e38f40d628db       3 minutes ago       Running             storage-provisioner       2                   700d9f5e713d3       storage-provisioner
	dd5e8f56c4264       5cc3abe5717db       4 minutes ago       Running             kindnet-cni               1                   dbdf19f96898d       kindnet-5zksq
	b50ede0dde503       cbb01a7bd410d       4 minutes ago       Running             coredns                   1                   4c25cc8ac2148       coredns-7db6d8ff4d-n4xtd
	b27c10fa3251b       8c811b4aec35f       4 minutes ago       Running             busybox                   1                   c15a92e53e40d       busybox-fc5497c4f-5ngfp
	85983f98f84b9       cbb01a7bd410d       4 minutes ago       Running             coredns                   1                   507cc72648f25       coredns-7db6d8ff4d-sh96r
	603ad8840c526       6e38f40d628db       4 minutes ago       Exited              storage-provisioner       1                   700d9f5e713d3       storage-provisioner
	cede48d48fe27       53c535741fb44       4 minutes ago       Running             kube-proxy                1                   1b59105c6df2e       kube-proxy-jlzt5
	7f7ede089f3e7       7820c83aa1394       4 minutes ago       Running             kube-scheduler            1                   903065308cbb5       kube-scheduler-ha-333994
	38a3e6e69ce36       e874818b3caac       4 minutes ago       Running             kube-controller-manager   1                   bfcca696b5273       kube-controller-manager-ha-333994
	3c3e7888bdfe6       56ce0fd9fb532       4 minutes ago       Running             kube-apiserver            1                   2a8a2b0c39cd0       kube-apiserver-ha-333994
	41d1b53347d3e       3861cfcd7c04c       4 minutes ago       Running             etcd                      1                   7982d05a46241       etcd-ha-333994
	529be299dc3b8       38af8ddebf499       4 minutes ago       Running             kube-vip                  0                   fb62346baad47       kube-vip-ha-333994
	db107babf5b82       8c811b4aec35f       26 minutes ago      Exited              busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	dcb6f2bdfe23d       cbb01a7bd410d       27 minutes ago      Exited              coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       27 minutes ago      Exited              coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       27 minutes ago      Exited              kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       27 minutes ago      Exited              kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	d3a0374a88e2c       56ce0fd9fb532       27 minutes ago      Exited              kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       27 minutes ago      Exited              kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       27 minutes ago      Exited              etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       27 minutes ago      Exited              kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.464434972Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.673549472Z" level=info msg="RemoveContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.682188663Z" level=info msg="RemoveContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.314045705Z" level=info msg="RemoveContainer for \"2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.319121815Z" level=info msg="RemoveContainer for \"2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320511033Z" level=info msg="StopPodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320605313Z" level=info msg="TearDown network for sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320616460Z" level=info msg="StopPodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320971991Z" level=info msg="RemovePodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.321016823Z" level=info msg="Forcibly stopping sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.321072160Z" level=info msg="TearDown network for sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.325612741Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.325748048Z" level=info msg="RemovePodSandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326267222Z" level=info msg="StopPodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326463624Z" level=info msg="TearDown network for sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326510323Z" level=info msg="StopPodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326827690Z" level=info msg="RemovePodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326922590Z" level=info msg="Forcibly stopping sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326997124Z" level=info msg="TearDown network for sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.331124459Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.331204383Z" level=info msg="RemovePodSandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" returns successfully"
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.387511700Z" level=info msg="CreateContainer within sandbox \"700d9f5e713d3946ac2752599935acff0c22e7d5b1d38328f08b4514902b10af\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:2,}"
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.414846958Z" level=info msg="CreateContainer within sandbox \"700d9f5e713d3946ac2752599935acff0c22e7d5b1d38328f08b4514902b10af\" for &ContainerMetadata{Name:storage-provisioner,Attempt:2,} returns container id \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\""
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.415806226Z" level=info msg="StartContainer for \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\""
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.483461513Z" level=info msg="StartContainer for \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [85983f98f84b97a11a481548c17b6e998bfec291ea5b38640a0522d82a174e86] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:32930 - 39231 "HINFO IN 1138402013862295929.6773124709558145559. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.011527303s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[649992777]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.508) (total time: 30004ms):
	Trace[649992777]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30004ms (17:50:20.513)
	Trace[649992777]: [30.004346914s] [30.004346914s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[119638294]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.509) (total time: 30004ms):
	Trace[119638294]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30003ms (17:50:20.512)
	Trace[119638294]: [30.004435266s] [30.004435266s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1087831118]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.513) (total time: 30001ms):
	Trace[1087831118]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:50:20.514)
	Trace[1087831118]: [30.001558122s] [30.001558122s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [b50ede0dde50338ef9fddc834d572f0d265fdc75b3a6e0ffab0b3a090f0cfac9] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:35715 - 11457 "HINFO IN 3013652693694148412.8082718229865211359. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.009035708s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1696274823]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.643) (total time: 30002ms):
	Trace[1696274823]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:50:20.645)
	Trace[1696274823]: [30.002410627s] [30.002410627s] END
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[990945787]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.645) (total time: 30001ms):
	Trace[990945787]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:50:20.645)
	Trace[990945787]: [30.00126887s] [30.00126887s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1760112988]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.646) (total time: 30000ms):
	Trace[1760112988]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:50:20.646)
	Trace[1760112988]: [30.000893639s] [30.000893639s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:53:51 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    4c5a3bea-29ed-4c23-a2f3-16d92a2e967b
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     27m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     27m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         27m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      27m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m9s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         27m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 27m                    kube-proxy       
	  Normal  Starting                 4m7s                   kube-proxy       
	  Normal  Starting                 27m                    kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  27m (x4 over 27m)      kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    27m (x4 over 27m)      kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     27m (x3 over 27m)      kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  27m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientPID     27m                    kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeHasSufficientMemory  27m                    kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    27m                    kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeAllocatableEnforced  27m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 27m                    kubelet          Starting kubelet.
	  Normal  RegisteredNode           27m                    node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                27m                    kubelet          Node ha-333994 status is now: NodeReady
	  Normal  Starting                 4m25s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  4m25s (x8 over 4m25s)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m25s (x8 over 4m25s)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m25s (x7 over 4m25s)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  4m25s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           4m3s                   node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	
	
	Name:               ha-333994-m03
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_07_17T17_40_16_0700
	                    minikube.k8s.io/version=v1.33.1
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:40:15 +0000
	Taints:             node.kubernetes.io/unreachable:NoExecute
	                    node.kubernetes.io/unreachable:NoSchedule
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994-m03
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:46:02 +0000
	Conditions:
	  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason              Message
	  ----             ------    -----------------                 ------------------                ------              -------
	  MemoryPressure   Unknown   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:46:44 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  DiskPressure     Unknown   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:46:44 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  PIDPressure      Unknown   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:46:44 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  Ready            Unknown   Wed, 17 Jul 2024 17:45:53 +0000   Wed, 17 Jul 2024 17:46:44 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	Addresses:
	  InternalIP:  192.168.39.197
	  Hostname:    ha-333994-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 91a213a8eb09434f90fc54c32c57b24f
	  System UUID:                91a213a8-eb09-434f-90fc-54c32c57b24f
	  Boot ID:                    45ccee74-7f48-47d9-9195-b6f993074cc5
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                       ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-74lsp    0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kindnet-24fc8              100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      13m
	  kube-system                 kube-proxy-xkkdj           0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 13m                kube-proxy       
	  Normal  NodeHasSufficientMemory  13m (x2 over 13m)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    13m (x2 over 13m)  kubelet          Node ha-333994-m03 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     13m (x2 over 13m)  kubelet          Node ha-333994-m03 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  13m                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           13m                node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	  Normal  NodeReady                13m                kubelet          Node ha-333994-m03 status is now: NodeReady
	  Normal  NodeNotReady             7m14s              node-controller  Node ha-333994-m03 status is now: NodeNotReady
	  Normal  RegisteredNode           4m3s               node-controller  Node ha-333994-m03 event: Registered Node ha-333994-m03 in Controller
	
	
	==> dmesg <==
	[Jul17 17:49] You have booted with nomodeset. This means your GPU drivers are DISABLED
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050055] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040308] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.524310] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.354966] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +1.596488] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000007] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +7.929260] systemd-fstab-generator[758]: Ignoring "noauto" option for root device
	[  +0.058074] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.064860] systemd-fstab-generator[770]: Ignoring "noauto" option for root device
	[  +0.158074] systemd-fstab-generator[784]: Ignoring "noauto" option for root device
	[  +0.141409] systemd-fstab-generator[796]: Ignoring "noauto" option for root device
	[  +0.316481] systemd-fstab-generator[830]: Ignoring "noauto" option for root device
	[  +1.413303] systemd-fstab-generator[905]: Ignoring "noauto" option for root device
	[  +6.936615] kauditd_printk_skb: 197 callbacks suppressed
	[  +9.904333] kauditd_printk_skb: 40 callbacks suppressed
	[  +6.090710] kauditd_printk_skb: 81 callbacks suppressed
	
	
	==> etcd [41d1b53347d3ec95c0752a7b8006e52252561ffd6b0613e71f4c4d1a66d84cd1] <==
	{"level":"info","ts":"2024-07-17T17:49:40.746451Z","caller":"fileutil/purge.go:50","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/snap","suffix":"snap","max":5,"interval":"30s"}
	{"level":"info","ts":"2024-07-17T17:49:40.746545Z","caller":"fileutil/purge.go:50","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/wal","suffix":"wal","max":5,"interval":"30s"}
	{"level":"info","ts":"2024-07-17T17:49:40.747109Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 switched to configuration voters=(808613133158692504)"}
	{"level":"info","ts":"2024-07-17T17:49:40.74735Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","added-peer-id":"b38c55c42a3b698","added-peer-peer-urls":["https://192.168.39.180:2380"]}
	{"level":"info","ts":"2024-07-17T17:49:40.747698Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:49:40.747826Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:49:40.768847Z","caller":"embed/etcd.go:726","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2024-07-17T17:49:40.769611Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"b38c55c42a3b698","initial-advertise-peer-urls":["https://192.168.39.180:2380"],"listen-peer-urls":["https://192.168.39.180:2380"],"advertise-client-urls":["https://192.168.39.180:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.39.180:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2024-07-17T17:49:40.771975Z","caller":"embed/etcd.go:857","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2024-07-17T17:49:40.783644Z","caller":"embed/etcd.go:597","msg":"serving peer traffic","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:49:40.784432Z","caller":"embed/etcd.go:569","msg":"cmux::serve","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:49:42.218092Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.218153Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.21819Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.218203Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218304Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218487Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218517Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.221374Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:49:42.221719Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:49:42.224325Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:49:42.224772Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:49:42.240735Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:49:42.240792Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:49:42.251537Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:41:11.077099Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":1506}
	{"level":"info","ts":"2024-07-17T17:41:11.08271Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":1506,"took":"4.803656ms","hash":4135639207,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2002944,"current-db-size-in-use":"2.0 MB"}
	{"level":"info","ts":"2024-07-17T17:41:11.082934Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":4135639207,"revision":1506,"compact-revision":967}
	{"level":"info","ts":"2024-07-17T17:46:11.088545Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":2115}
	{"level":"info","ts":"2024-07-17T17:46:11.093763Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":2115,"took":"4.690419ms","hash":3040853481,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2105344,"current-db-size-in-use":"2.1 MB"}
	{"level":"info","ts":"2024-07-17T17:46:11.093935Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":3040853481,"revision":2115,"compact-revision":1506}
	
	
	==> kernel <==
	 17:53:58 up 4 min,  0 users,  load average: 0.06, 0.12, 0.06
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [dd5e8f56c4264ac3ce97606579dbb45bd1defa712cc5dfd7ef8601f279e53896] <==
	I0717 17:52:51.809796       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:01.817602       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:01.817647       1 main.go:303] handling current node
	I0717 17:53:01.817662       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:01.817667       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:11.814577       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:11.814652       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:11.814805       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:11.814813       1 main.go:303] handling current node
	I0717 17:53:21.810119       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:21.810198       1 main.go:303] handling current node
	I0717 17:53:21.810249       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:21.810274       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:31.817062       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:31.817224       1 main.go:303] handling current node
	I0717 17:53:31.817323       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:31.817345       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:41.816987       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:41.817045       1 main.go:303] handling current node
	I0717 17:53:41.817065       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:41.817072       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:51.809287       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:51.809360       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:51.810037       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:51.810079       1 main.go:303] handling current node
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:46:36.593294       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:46:46.594446       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:46:46.594495       1 main.go:303] handling current node
	I0717 17:46:46.594508       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:46:46.594516       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:46:56.593210       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:46:56.593351       1 main.go:303] handling current node
	I0717 17:46:56.593473       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:46:56.593496       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:06.593427       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:06.593567       1 main.go:303] handling current node
	I0717 17:47:06.593587       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:06.593593       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:16.603181       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:16.603262       1 main.go:303] handling current node
	I0717 17:47:16.603286       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:16.603292       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:26.593294       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:26.593479       1 main.go:303] handling current node
	I0717 17:47:26.593751       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:26.593932       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:36.593175       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:36.593213       1 main.go:303] handling current node
	I0717 17:47:36.593235       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:36.593240       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [3c3e7888bdfe65eb452a8b1911680c8ed68a5d49a41528c6544c9bdbad54463d] <==
	I0717 17:49:43.595082       1 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
	I0717 17:49:43.595111       1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
	I0717 17:49:43.595140       1 crd_finalizer.go:266] Starting CRDFinalizer
	I0717 17:49:43.597000       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0717 17:49:43.597114       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0717 17:49:43.641418       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0717 17:49:43.648238       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0717 17:49:43.648665       1 policy_source.go:224] refreshing policies
	I0717 17:49:43.659841       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:49:43.676754       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0717 17:49:43.677085       1 shared_informer.go:320] Caches are synced for configmaps
	I0717 17:49:43.679683       1 apf_controller.go:379] Running API Priority and Fairness config worker
	I0717 17:49:43.679810       1 apf_controller.go:382] Running API Priority and Fairness periodic rebalancing process
	I0717 17:49:43.682669       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0717 17:49:43.686464       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0717 17:49:43.688086       1 handler_discovery.go:447] Starting ResourceDiscoveryManager
	E0717 17:49:43.689041       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0717 17:49:43.691390       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0717 17:49:43.692086       1 aggregator.go:165] initial CRD sync complete...
	I0717 17:49:43.692210       1 autoregister_controller.go:141] Starting autoregister controller
	I0717 17:49:43.692231       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0717 17:49:43.692323       1 cache.go:39] Caches are synced for autoregister controller
	I0717 17:49:44.589738       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:49:55.907406       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:49:56.140322       1 controller.go:615] quota admission added evaluator for: endpoints
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [38a3e6e69ce36e4718f7597a891505e74d497b2ce82217fdebe3363666ea32f6] <==
	I0717 17:49:55.938816       1 shared_informer.go:320] Caches are synced for PV protection
	I0717 17:49:55.945742       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="27.053326ms"
	I0717 17:49:55.946148       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="322.684µs"
	I0717 17:49:55.948558       1 shared_informer.go:320] Caches are synced for job
	I0717 17:49:55.953819       1 shared_informer.go:320] Caches are synced for stateful set
	I0717 17:49:55.969497       1 shared_informer.go:320] Caches are synced for disruption
	I0717 17:49:55.969720       1 shared_informer.go:320] Caches are synced for daemon sets
	I0717 17:49:55.989955       1 shared_informer.go:320] Caches are synced for crt configmap
	I0717 17:49:55.995325       1 shared_informer.go:320] Caches are synced for taint
	I0717 17:49:55.995861       1 node_lifecycle_controller.go:1227] "Initializing eviction metric for zone" logger="node-lifecycle-controller" zone=""
	I0717 17:49:56.008684       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994"
	I0717 17:49:56.009020       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:49:56.009215       1 node_lifecycle_controller.go:1073] "Controller detected that zone is now in new state" logger="node-lifecycle-controller" zone="" newState="Normal"
	I0717 17:49:56.107028       1 shared_informer.go:320] Caches are synced for endpoint_slice_mirroring
	I0717 17:49:56.125129       1 shared_informer.go:320] Caches are synced for HPA
	I0717 17:49:56.130984       1 shared_informer.go:320] Caches are synced for endpoint
	I0717 17:49:56.150989       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:49:56.160240       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:49:56.545417       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:49:56.545744       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0717 17:49:56.607585       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:50:29.652302       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="17.989423ms"
	I0717 17:50:29.652927       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="154.343µs"
	I0717 17:50:29.673006       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="10.432657ms"
	I0717 17:50:29.674427       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="35.074µs"
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	I0717 17:46:44.300951       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.533645ms"
	I0717 17:46:44.302036       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="47.71µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-proxy [cede48d48fe274c1e899c0bd8bea598571a7def0a52e5e2bade595ef4f553fef] <==
	I0717 17:49:50.697431       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:49:50.728033       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:49:50.773252       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:49:50.773306       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:49:50.773323       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:49:50.776016       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:49:50.776460       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:49:50.776490       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:49:50.778529       1 config.go:192] "Starting service config controller"
	I0717 17:49:50.778847       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:49:50.778963       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:49:50.779098       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:49:50.780341       1 config.go:319] "Starting node config controller"
	I0717 17:49:50.780372       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:49:50.880389       1 shared_informer.go:320] Caches are synced for service config
	I0717 17:49:50.880465       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:49:50.880915       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kube-scheduler [7f7ede089f3e73228764b3c542d044e8dfb371908879f2d014d0b3cb56b61a60] <==
	I0717 17:49:41.818392       1 serving.go:380] Generated self-signed cert in-memory
	I0717 17:49:43.698181       1 server.go:154] "Starting Kubernetes Scheduler" version="v1.30.2"
	I0717 17:49:43.698222       1 server.go:156] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:49:43.704731       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0717 17:49:43.704960       1 requestheader_controller.go:169] Starting RequestHeaderAuthRequestController
	I0717 17:49:43.705003       1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController
	I0717 17:49:43.705055       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0717 17:49:43.708667       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0717 17:49:43.708702       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0717 17:49:43.708715       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file"
	I0717 17:49:43.708721       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
	I0717 17:49:43.805438       1 shared_informer.go:320] Caches are synced for RequestHeaderAuthRequestController
	I0717 17:49:43.809697       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
	I0717 17:49:43.809823       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:50:20 ha-333994 kubelet[912]: I0717 17:50:20.667533     912 scope.go:117] "RemoveContainer" containerID="86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21"
	Jul 17 17:50:20 ha-333994 kubelet[912]: I0717 17:50:20.668345     912 scope.go:117] "RemoveContainer" containerID="603ad8840c52684184d18957755dbefa293c0f1b45c847cd88296b580d9ac18f"
	Jul 17 17:50:20 ha-333994 kubelet[912]: E0717 17:50:20.668770     912 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(123c311b-67ed-42b2-ad53-cc59077dfbe7)\"" pod="kube-system/storage-provisioner" podUID="123c311b-67ed-42b2-ad53-cc59077dfbe7"
	Jul 17 17:50:33 ha-333994 kubelet[912]: I0717 17:50:33.312537     912 scope.go:117] "RemoveContainer" containerID="2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d"
	Jul 17 17:50:33 ha-333994 kubelet[912]: E0717 17:50:33.409447     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:50:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:50:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:50:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:50:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:50:36 ha-333994 kubelet[912]: I0717 17:50:36.384656     912 scope.go:117] "RemoveContainer" containerID="603ad8840c52684184d18957755dbefa293c0f1b45c847cd88296b580d9ac18f"
	Jul 17 17:51:33 ha-333994 kubelet[912]: E0717 17:51:33.410923     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:51:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:51:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:51:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:51:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:52:33 ha-333994 kubelet[912]: E0717 17:52:33.411201     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:52:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:52:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:52:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:52:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:53:33 ha-333994 kubelet[912]: E0717 17:53:33.409498     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:53:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:53:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:53:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:53:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/RestartClusterKeepsNodes]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6:

-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  4m15s                default-scheduler  0/2 nodes are available: 1 node(s) didn't match pod anti-affinity rules, 1 node(s) had untolerated taint {node.kubernetes.io/unreachable: }. preemption: 0/2 nodes are available: 1 No preemption victims found for incoming pod, 1 Preemption is not helpful for scheduling.
	  Warning  FailedScheduling  4m10s                default-scheduler  0/2 nodes are available: 1 node(s) didn't match pod anti-affinity rules, 1 node(s) had untolerated taint {node.kubernetes.io/unreachable: }. preemption: 0/2 nodes are available: 1 No preemption victims found for incoming pod, 1 Preemption is not helpful for scheduling.
	  Warning  FailedScheduling  16m (x3 over 26m)    default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  8m13s (x3 over 13m)  default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.

-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/RestartClusterKeepsNodes FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/RestartClusterKeepsNodes (473.38s)

TestMultiControlPlane/serial/DeleteSecondaryNode (10.53s)

=== RUN   TestMultiControlPlane/serial/DeleteSecondaryNode
ha_test.go:487: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 node delete m03 -v=7 --alsologtostderr
ha_test.go:487: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 node delete m03 -v=7 --alsologtostderr: (7.58029375s)
ha_test.go:493: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:493: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: exit status 2 (403.69648ms)

-- stdout --
	ha-333994
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-333994-m02
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	

-- /stdout --
** stderr ** 
	I0717 17:54:07.172713   41168 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:54:07.172945   41168 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:54:07.172954   41168 out.go:304] Setting ErrFile to fd 2...
	I0717 17:54:07.172958   41168 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:54:07.173134   41168 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:54:07.173283   41168 out.go:298] Setting JSON to false
	I0717 17:54:07.173312   41168 mustload.go:65] Loading cluster: ha-333994
	I0717 17:54:07.173415   41168 notify.go:220] Checking for updates...
	I0717 17:54:07.173625   41168 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:54:07.173637   41168 status.go:255] checking status of ha-333994 ...
	I0717 17:54:07.174008   41168 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:54:07.174053   41168 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:54:07.189595   41168 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45481
	I0717 17:54:07.190065   41168 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:54:07.190786   41168 main.go:141] libmachine: Using API Version  1
	I0717 17:54:07.190831   41168 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:54:07.191124   41168 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:54:07.191310   41168 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:54:07.192855   41168 status.go:330] ha-333994 host status = "Running" (err=<nil>)
	I0717 17:54:07.192869   41168 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:54:07.193141   41168 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:54:07.193175   41168 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:54:07.208477   41168 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34365
	I0717 17:54:07.208902   41168 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:54:07.209399   41168 main.go:141] libmachine: Using API Version  1
	I0717 17:54:07.209422   41168 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:54:07.209691   41168 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:54:07.209886   41168 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:54:07.212681   41168 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:54:07.213097   41168 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:21 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:54:07.213124   41168 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:54:07.213229   41168 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:54:07.213534   41168 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:54:07.213571   41168 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:54:07.228211   41168 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44271
	I0717 17:54:07.228684   41168 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:54:07.229168   41168 main.go:141] libmachine: Using API Version  1
	I0717 17:54:07.229189   41168 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:54:07.229550   41168 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:54:07.229703   41168 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:54:07.229913   41168 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:54:07.229939   41168 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:54:07.232920   41168 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:54:07.233374   41168 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:21 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:54:07.233404   41168 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:54:07.233597   41168 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:54:07.233784   41168 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:54:07.233974   41168 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:54:07.234131   41168 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:54:07.317601   41168 ssh_runner.go:195] Run: systemctl --version
	I0717 17:54:07.324017   41168 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:54:07.337615   41168 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:54:07.337644   41168 api_server.go:166] Checking apiserver status ...
	I0717 17:54:07.337677   41168 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 17:54:07.351519   41168 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1346/cgroup
	W0717 17:54:07.361924   41168 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1346/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:54:07.361980   41168 ssh_runner.go:195] Run: ls
	I0717 17:54:07.366216   41168 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0717 17:54:07.370218   41168 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0717 17:54:07.370241   41168 status.go:422] ha-333994 apiserver status = Running (err=<nil>)
	I0717 17:54:07.370253   41168 status.go:257] ha-333994 status: &{Name:ha-333994 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 17:54:07.370278   41168 status.go:255] checking status of ha-333994-m02 ...
	I0717 17:54:07.370557   41168 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:54:07.370598   41168 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:54:07.385158   41168 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45429
	I0717 17:54:07.385523   41168 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:54:07.385972   41168 main.go:141] libmachine: Using API Version  1
	I0717 17:54:07.385994   41168 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:54:07.386340   41168 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:54:07.386527   41168 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:54:07.387829   41168 status.go:330] ha-333994-m02 host status = "Running" (err=<nil>)
	I0717 17:54:07.387845   41168 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:54:07.388144   41168 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:54:07.388177   41168 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:54:07.402680   41168 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35339
	I0717 17:54:07.403103   41168 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:54:07.403543   41168 main.go:141] libmachine: Using API Version  1
	I0717 17:54:07.403563   41168 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:54:07.403853   41168 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:54:07.404030   41168 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:54:07.406684   41168 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:54:07.407137   41168 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:54:07.407167   41168 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:54:07.407277   41168 host.go:66] Checking if "ha-333994-m02" exists ...
	I0717 17:54:07.407565   41168 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:54:07.407595   41168 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:54:07.421806   41168 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43149
	I0717 17:54:07.422212   41168 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:54:07.422702   41168 main.go:141] libmachine: Using API Version  1
	I0717 17:54:07.422727   41168 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:54:07.423026   41168 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:54:07.423213   41168 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:54:07.423381   41168 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 17:54:07.423401   41168 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:54:07.426237   41168 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:54:07.426622   41168 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:54:07.426647   41168 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:54:07.426812   41168 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:54:07.426985   41168 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:54:07.427098   41168 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:54:07.427219   41168 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:54:07.506046   41168 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 17:54:07.522214   41168 kubeconfig.go:125] found "ha-333994" server: "https://192.168.39.254:8443"
	I0717 17:54:07.522239   41168 api_server.go:166] Checking apiserver status ...
	I0717 17:54:07.522277   41168 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0717 17:54:07.535055   41168 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:54:07.535078   41168 status.go:422] ha-333994-m02 apiserver status = Stopped (err=<nil>)
	I0717 17:54:07.535089   41168 status.go:257] ha-333994-m02 status: &{Name:ha-333994-m02 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:495: failed to run minikube status. args "out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr" : exit status 2
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/DeleteSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/DeleteSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.64125723s)
helpers_test.go:252: TestMultiControlPlane/serial/DeleteSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node stop m02 -v=7         | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node start m02 -v=7        | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-333994 -v=7               | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:46 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-333994 -v=7                    | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:46 UTC | 17 Jul 24 17:49 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-333994 --wait=true -v=7        | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:49 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-333994                    | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:53 UTC |                     |
	| node    | ha-333994 node delete m03 -v=7       | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:53 UTC | 17 Jul 24 17:54 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:49:11
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:49:11.274843   39794 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:49:11.274995   39794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:49:11.275005   39794 out.go:304] Setting ErrFile to fd 2...
	I0717 17:49:11.275011   39794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:49:11.275192   39794 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:49:11.275748   39794 out.go:298] Setting JSON to false
	I0717 17:49:11.276624   39794 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":5494,"bootTime":1721233057,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:49:11.276685   39794 start.go:139] virtualization: kvm guest
	I0717 17:49:11.279428   39794 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:49:11.280920   39794 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:49:11.280939   39794 notify.go:220] Checking for updates...
	I0717 17:49:11.284081   39794 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:49:11.285572   39794 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:11.286973   39794 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:49:11.288259   39794 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:49:11.289617   39794 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:49:11.291360   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:11.291471   39794 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:49:11.291860   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:11.291910   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:11.306389   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41441
	I0717 17:49:11.306830   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:11.307340   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:11.307365   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:11.307652   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:11.307877   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:11.342518   39794 out.go:177] * Using the kvm2 driver based on existing profile
	I0717 17:49:11.343905   39794 start.go:297] selected driver: kvm2
	I0717 17:49:11.343922   39794 start.go:901] validating driver "kvm2" against &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVer
sion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false
ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountU
ID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:11.344074   39794 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:49:11.344385   39794 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:49:11.344460   39794 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:49:11.359473   39794 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:49:11.360126   39794 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:49:11.360191   39794 cni.go:84] Creating CNI manager for ""
	I0717 17:49:11.360203   39794 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0717 17:49:11.360258   39794 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39
.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false i
stio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:fa
lse CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:11.360356   39794 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:49:11.362215   39794 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:49:11.363497   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:11.363528   39794 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:49:11.363538   39794 cache.go:56] Caching tarball of preloaded images
	I0717 17:49:11.363621   39794 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:49:11.363633   39794 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:49:11.363751   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:11.363927   39794 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:49:11.363968   39794 start.go:364] duration metric: took 23.038µs to acquireMachinesLock for "ha-333994"
	I0717 17:49:11.363985   39794 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:49:11.363995   39794 fix.go:54] fixHost starting: 
	I0717 17:49:11.364238   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:11.364269   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:11.378515   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45003
	I0717 17:49:11.378994   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:11.379458   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:11.379478   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:11.379772   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:11.379977   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:11.380153   39794 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:49:11.381889   39794 fix.go:112] recreateIfNeeded on ha-333994: state=Stopped err=<nil>
	I0717 17:49:11.381920   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	W0717 17:49:11.382061   39794 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:49:11.384353   39794 out.go:177] * Restarting existing kvm2 VM for "ha-333994" ...
	I0717 17:49:11.386332   39794 main.go:141] libmachine: (ha-333994) Calling .Start
	I0717 17:49:11.386525   39794 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:49:11.387295   39794 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:49:11.387605   39794 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:49:11.387902   39794 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:49:11.388700   39794 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:49:12.581316   39794 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:49:12.582199   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:12.582613   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:12.582685   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:12.582591   39823 retry.go:31] will retry after 292.960023ms: waiting for machine to come up
	I0717 17:49:12.877268   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:12.877833   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:12.877861   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:12.877756   39823 retry.go:31] will retry after 283.500887ms: waiting for machine to come up
	I0717 17:49:13.163417   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:13.163805   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:13.163826   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:13.163761   39823 retry.go:31] will retry after 385.368306ms: waiting for machine to come up
	I0717 17:49:13.550406   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:13.550840   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:13.550897   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:13.550822   39823 retry.go:31] will retry after 528.571293ms: waiting for machine to come up
	I0717 17:49:14.080602   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:14.081093   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:14.081118   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:14.081048   39823 retry.go:31] will retry after 736.772802ms: waiting for machine to come up
	I0717 17:49:14.818924   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:14.819326   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:14.819347   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:14.819281   39823 retry.go:31] will retry after 776.986347ms: waiting for machine to come up
	I0717 17:49:15.598237   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:15.598607   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:15.598627   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:15.598573   39823 retry.go:31] will retry after 1.036578969s: waiting for machine to come up
	I0717 17:49:16.637046   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:16.637440   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:16.637463   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:16.637404   39823 retry.go:31] will retry after 1.055320187s: waiting for machine to come up
	I0717 17:49:17.694838   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:17.695248   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:17.695273   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:17.695211   39823 retry.go:31] will retry after 1.335817707s: waiting for machine to come up
	I0717 17:49:19.032835   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:19.033306   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:19.033330   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:19.033266   39823 retry.go:31] will retry after 1.730964136s: waiting for machine to come up
	I0717 17:49:20.766254   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:20.766740   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:20.766768   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:20.766694   39823 retry.go:31] will retry after 2.796619276s: waiting for machine to come up
	I0717 17:49:23.566195   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:23.566759   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:23.566784   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:23.566716   39823 retry.go:31] will retry after 3.008483388s: waiting for machine to come up
	I0717 17:49:26.576866   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:26.577295   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:26.577318   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:26.577242   39823 retry.go:31] will retry after 2.889284576s: waiting for machine to come up
	I0717 17:49:29.467942   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.468316   39794 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:49:29.468337   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.468346   39794 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:49:29.468737   39794 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:49:29.468757   39794 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:49:29.468777   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.468804   39794 main.go:141] libmachine: (ha-333994) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"}
	I0717 17:49:29.468820   39794 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:49:29.470695   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.471026   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.471058   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.471199   39794 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:49:29.471226   39794 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:49:29.471255   39794 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:49:29.471268   39794 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:49:29.471282   39794 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:49:29.598374   39794 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:49:29.598754   39794 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:49:29.599414   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:29.601913   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.602312   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.602351   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.602634   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:29.602858   39794 machine.go:94] provisionDockerMachine start ...
	I0717 17:49:29.602888   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:29.603106   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.605092   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.605423   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.605446   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.605613   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.605754   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.605900   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.606023   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.606203   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.606385   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.606396   39794 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:49:29.714755   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:49:29.714801   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.715040   39794 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:49:29.715065   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.715237   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.717642   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.717930   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.717959   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.718110   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.718285   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.718413   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.718528   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.718679   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.718838   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.718848   39794 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:49:29.840069   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:49:29.840100   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.842822   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.843208   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.843233   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.843392   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.843581   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.843706   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.843878   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.844054   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.844256   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.844272   39794 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:49:29.959423   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:49:29.959450   39794 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:49:29.959474   39794 buildroot.go:174] setting up certificates
	I0717 17:49:29.959488   39794 provision.go:84] configureAuth start
	I0717 17:49:29.959495   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.959790   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:29.962162   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.962537   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.962563   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.962700   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.964777   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.965084   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.965116   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.965226   39794 provision.go:143] copyHostCerts
	I0717 17:49:29.965266   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:29.965305   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:49:29.965317   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:29.965397   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:49:29.965507   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:29.965534   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:49:29.965544   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:29.965581   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:49:29.965639   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:29.965671   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:49:29.965680   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:29.965714   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:49:29.965774   39794 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:49:30.057325   39794 provision.go:177] copyRemoteCerts
	I0717 17:49:30.057377   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:49:30.057400   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.059825   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.060114   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.060140   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.060281   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.060451   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.060561   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.060675   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.146227   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:49:30.146289   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:49:30.174390   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:49:30.174450   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:49:30.202477   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:49:30.202541   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0717 17:49:30.229907   39794 provision.go:87] duration metric: took 270.408982ms to configureAuth
	I0717 17:49:30.229929   39794 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:49:30.230164   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:30.230177   39794 machine.go:97] duration metric: took 627.307249ms to provisionDockerMachine
	I0717 17:49:30.230186   39794 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:49:30.230200   39794 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:49:30.230227   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.230520   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:49:30.230554   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.233026   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.233363   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.233390   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.233521   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.233700   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.233828   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.233952   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.318669   39794 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:49:30.323112   39794 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:49:30.323131   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:49:30.323180   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:49:30.323246   39794 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:49:30.323258   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:49:30.323348   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:49:30.334564   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:30.360407   39794 start.go:296] duration metric: took 130.206138ms for postStartSetup
	I0717 17:49:30.360441   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.360727   39794 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:49:30.360774   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.362968   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.363308   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.363334   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.363435   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.363609   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.363749   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.363862   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.448825   39794 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:49:30.448901   39794 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:49:30.490930   39794 fix.go:56] duration metric: took 19.126931057s for fixHost
	I0717 17:49:30.490966   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.493716   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.494056   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.494081   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.494261   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.494473   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.494636   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.494816   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.495007   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:30.495221   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:30.495236   39794 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:49:30.611220   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238570.579395854
	
	I0717 17:49:30.611243   39794 fix.go:216] guest clock: 1721238570.579395854
	I0717 17:49:30.611255   39794 fix.go:229] Guest: 2024-07-17 17:49:30.579395854 +0000 UTC Remote: 2024-07-17 17:49:30.49095133 +0000 UTC m=+19.250883626 (delta=88.444524ms)
	I0717 17:49:30.611271   39794 fix.go:200] guest clock delta is within tolerance: 88.444524ms
	I0717 17:49:30.611277   39794 start.go:83] releasing machines lock for "ha-333994", held for 19.24729888s
	I0717 17:49:30.611293   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.611569   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:30.613990   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.614318   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.614355   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.614483   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.614909   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.615067   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.615169   39794 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:49:30.615215   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.615255   39794 ssh_runner.go:195] Run: cat /version.json
	I0717 17:49:30.615275   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.617353   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617676   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.617702   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617734   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617863   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.618049   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.618146   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.618173   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.618217   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.618306   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.618370   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.618445   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.618555   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.618672   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.694919   39794 ssh_runner.go:195] Run: systemctl --version
	I0717 17:49:30.721823   39794 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:49:30.727892   39794 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:49:30.727967   39794 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:49:30.745249   39794 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:49:30.745272   39794 start.go:495] detecting cgroup driver to use...
	I0717 17:49:30.745332   39794 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:49:30.784101   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:49:30.798192   39794 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:49:30.798265   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:49:30.811458   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:49:30.824815   39794 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:49:30.938731   39794 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:49:31.081893   39794 docker.go:233] disabling docker service ...
	I0717 17:49:31.081980   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:49:31.097028   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:49:31.110328   39794 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:49:31.242915   39794 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:49:31.365050   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:49:31.379135   39794 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:49:31.400136   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:49:31.412561   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:49:31.425082   39794 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:49:31.425159   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:49:31.437830   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:31.450453   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:49:31.462175   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:31.473289   39794 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:49:31.484541   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:49:31.495502   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:49:31.506265   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
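	The run of `sudo sed -i -r` commands above rewrites /etc/containerd/config.toml in place to force the cgroupfs driver and pin the pause image. The same edits can be exercised against a scratch copy without a VM (the sample TOML below is illustrative, not the image's real config):

	```shell
	#!/bin/sh
	# Apply the cgroup-driver and sandbox-image edits from the log to a copy.
	cfg=$(mktemp)
	cat > "$cfg" <<'EOF'
	[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
	  SystemdCgroup = true
	[plugins."io.containerd.grpc.v1.cri"]
	  sandbox_image = "registry.k8s.io/pause:3.8"
	EOF

	# Mirror the `sudo sed -i -r` commands shown in the log (GNU sed -i).
	sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$cfg"
	sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' "$cfg"

	grep 'SystemdCgroup = false' "$cfg"
	grep 'pause:3.9' "$cfg"
	rm -f "$cfg"
	```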
	I0717 17:49:31.518840   39794 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:49:31.530158   39794 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:49:31.530208   39794 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:49:31.548502   39794 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:49:31.563431   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:31.674043   39794 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:49:31.701907   39794 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:49:31.702006   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:31.706668   39794 retry.go:31] will retry after 920.793788ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
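	The retry.go line above backs off (~920ms) and re-stats /run/containerd/containerd.sock until the restarted daemon has created it, inside the 60s budget start.go announces. A minimal sketch of that wait loop (poll interval and attempt count are assumptions):

	```shell
	#!/bin/sh
	# Poll for a path until it exists or the attempt budget runs out,
	# roughly what the "will retry after ..." stat loop is doing.
	wait_for_path() {
	  path=$1; attempts=$2; interval=$3
	  i=0
	  while [ "$i" -lt "$attempts" ]; do
	    [ -e "$path" ] && return 0
	    i=$((i + 1))
	    sleep "$interval"
	  done
	  return 1
	}

	tmp=$(mktemp -d)
	( sleep 0.2; touch "$tmp/containerd.sock" ) &   # simulate the daemon coming up
	wait_for_path "$tmp/containerd.sock" 10 0.1 && echo "socket ready"
	```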
	I0717 17:49:32.627794   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:32.632953   39794 start.go:563] Will wait 60s for crictl version
	I0717 17:49:32.633009   39794 ssh_runner.go:195] Run: which crictl
	I0717 17:49:32.636846   39794 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:49:32.677947   39794 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:49:32.678013   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:32.709490   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:32.738106   39794 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:49:32.739529   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:32.742040   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:32.742375   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:32.742405   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:32.742590   39794 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:49:32.746706   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
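	The bash one-liner above is minikube's idempotent hosts-file update: strip any stale `host.minikube.internal` line, append the fresh mapping, and copy the result back over /etc/hosts. The same pattern, exercised on a scratch file:

	```shell
	#!/bin/sh
	# Idempotently pin a name to an IP in a hosts-format file, the same
	# "{ grep -v ...; echo ...; } > tmp; cp" pattern the log runs.
	set_hosts_entry() {
	  file=$1; ip=$2; name=$3
	  tab=$(printf '\t')
	  { grep -v "${tab}${name}\$" "$file"; printf '%s\t%s\n' "$ip" "$name"; } > "$file.tmp"
	  mv "$file.tmp" "$file"
	}

	hosts=$(mktemp)
	printf '127.0.0.1\tlocalhost\n192.168.39.2\thost.minikube.internal\n' > "$hosts"
	set_hosts_entry "$hosts" 192.168.39.1 host.minikube.internal
	set_hosts_entry "$hosts" 192.168.39.1 host.minikube.internal   # rerun is a no-op
	grep -c 'host.minikube.internal' "$hosts"   # a single entry remains
	```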
	I0717 17:49:32.759433   39794 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingre
ss:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:do
cker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:49:32.759609   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:32.759661   39794 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:49:32.792410   39794 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:49:32.792432   39794 containerd.go:534] Images already preloaded, skipping extraction
	I0717 17:49:32.792483   39794 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:49:32.824536   39794 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:49:32.824558   39794 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:49:32.824565   39794 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:49:32.824675   39794 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:49:32.824722   39794 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:49:32.856864   39794 cni.go:84] Creating CNI manager for ""
	I0717 17:49:32.856886   39794 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0717 17:49:32.856893   39794 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:49:32.856917   39794 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:49:32.857032   39794 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
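	The kubeadm config dump above is later written to /var/tmp/minikube/kubeadm.yaml.new before `kubeadm init` runs. A quick sanity check of the fields this HA profile depends on can be scripted (the checked field list is a hand-picked assumption, not minikube's own validation):

	```shell
	#!/bin/sh
	# Spot-check a rendered kubeadm config for the shared control-plane
	# endpoint, the pod CIDR handed to kindnet, and the cgroupfs driver.
	check_kubeadm_config() {
	  f=$1
	  grep -q '^controlPlaneEndpoint: control-plane.minikube.internal:8443' "$f" &&
	  grep -q 'podSubnet: "10.244.0.0/16"' "$f" &&
	  grep -q '^cgroupDriver: cgroupfs' "$f"
	}

	f=$(mktemp)
	cat > "$f" <<'EOF'
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	podSubnet: "10.244.0.0/16"
	cgroupDriver: cgroupfs
	EOF
	check_kubeadm_config "$f" && echo "config looks sane"
	```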
	I0717 17:49:32.857054   39794 kube-vip.go:115] generating kube-vip config ...
	I0717 17:49:32.857090   39794 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:49:32.875326   39794 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:49:32.875456   39794 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
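	kube-vip.go templates the static-pod manifest above with the cluster VIP (192.168.39.254 here) substituted in. A minimal parameterized version of that templating, trimmed to the env vars that vary per cluster (function name and the trimmed field set are assumptions):

	```shell
	#!/bin/sh
	# Emit a reduced kube-vip static-pod manifest for a given VIP and port,
	# analogous to the generated config the log prints.
	render_kube_vip() {
	  vip=$1; port=$2
	  cat <<EOF
	apiVersion: v1
	kind: Pod
	metadata:
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - name: kube-vip
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    args: [manager]
	    env:
	    - {name: vip_arp, value: "true"}
	    - {name: port, value: "$port"}
	    - {name: cp_enable, value: "true"}
	    - {name: address, value: $vip}
	  hostNetwork: true
	EOF
	}

	render_kube_vip 192.168.39.254 8443
	```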
	I0717 17:49:32.875511   39794 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:49:32.885386   39794 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:49:32.885459   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:49:32.895011   39794 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:49:32.913107   39794 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:49:32.929923   39794 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:49:32.946336   39794 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:49:32.962757   39794 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:49:32.966796   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:32.979550   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:33.092357   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:33.111897   39794 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:49:33.111921   39794 certs.go:194] generating shared ca certs ...
	I0717 17:49:33.111940   39794 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.112113   39794 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:49:33.112206   39794 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:49:33.112225   39794 certs.go:256] generating profile certs ...
	I0717 17:49:33.112347   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:49:33.112383   39794 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1
	I0717 17:49:33.112401   39794 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:49:33.337392   39794 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 ...
	I0717 17:49:33.337432   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1: {Name:mkfeb2a5adc7d732ca48854394be4077f3b9b81e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.337612   39794 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1 ...
	I0717 17:49:33.337630   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1: {Name:mk17811291d2c587100f8fbd5f0c9c2d641ddf76 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.337728   39794 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:49:33.337924   39794 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1 -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:49:33.338098   39794 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:49:33.338134   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:49:33.338154   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:49:33.338172   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:49:33.338188   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:49:33.338203   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:49:33.338221   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:49:33.338239   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:49:33.338253   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:49:33.338313   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:49:33.338354   39794 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:49:33.338363   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:49:33.338391   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:49:33.338431   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:49:33.338457   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:49:33.338511   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:33.338549   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.338570   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.338587   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.339107   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:49:33.371116   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:49:33.405873   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:49:33.442007   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:49:33.472442   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0717 17:49:33.496116   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:49:33.527403   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:49:33.552684   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:49:33.576430   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:49:33.599936   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:49:33.623341   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:49:33.646635   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:49:33.663325   39794 ssh_runner.go:195] Run: openssl version
	I0717 17:49:33.668872   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:49:33.679471   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.683810   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.683866   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.689677   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:49:33.700471   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:49:33.710911   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.715522   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.715581   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.721331   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:49:33.731730   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:49:33.742074   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.746374   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.746417   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.751941   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:49:33.762070   39794 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:49:33.766344   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0717 17:49:33.771976   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0717 17:49:33.777506   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0717 17:49:33.783203   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0717 17:49:33.788713   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0717 17:49:33.794346   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
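The run above invokes `openssl x509 -checkend 86400` against each control-plane certificate; that flag exits 0 only if the certificate remains valid for at least the next 86400 seconds (24 hours). A minimal Python sketch of the equivalent expiry check (hypothetical helper, not minikube code; assumes the certificate's `notAfter` has already been parsed):

```python
from datetime import datetime, timedelta, timezone

def cert_still_valid(not_after: datetime, window_seconds: int = 86400) -> bool:
    """Mirror `openssl x509 -checkend N`: True iff the certificate's
    notAfter timestamp is still at least `window_seconds` in the future."""
    return not_after - datetime.now(timezone.utc) >= timedelta(seconds=window_seconds)
```

Because every `-checkend` run here succeeds silently, the log proceeds straight to `StartCluster` without regenerating any certificates.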
	I0717 17:49:33.800031   39794 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Clust
erName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:
false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docke
r BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:33.800131   39794 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:49:33.800172   39794 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:49:33.836926   39794 cri.go:89] found id: "86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21"
	I0717 17:49:33.836947   39794 cri.go:89] found id: "dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f"
	I0717 17:49:33.836952   39794 cri.go:89] found id: "5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a"
	I0717 17:49:33.836956   39794 cri.go:89] found id: "f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428"
	I0717 17:49:33.836959   39794 cri.go:89] found id: "0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45"
	I0717 17:49:33.836963   39794 cri.go:89] found id: "2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d"
	I0717 17:49:33.836967   39794 cri.go:89] found id: "d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411"
	I0717 17:49:33.836970   39794 cri.go:89] found id: "2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c"
	I0717 17:49:33.836974   39794 cri.go:89] found id: "5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46"
	I0717 17:49:33.836981   39794 cri.go:89] found id: "515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697"
	I0717 17:49:33.836985   39794 cri.go:89] found id: ""
	I0717 17:49:33.837036   39794 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0717 17:49:33.850888   39794 cri.go:116] JSON = null
	W0717 17:49:33.850933   39794 kubeadm.go:399] unpause failed: list paused: list returned 0 containers, but ps returned 10
	I0717 17:49:33.851001   39794 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:49:33.861146   39794 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0717 17:49:33.861164   39794 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0717 17:49:33.861204   39794 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0717 17:49:33.870180   39794 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:49:33.870557   39794 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-333994" does not appear in /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.870654   39794 kubeconfig.go:62] /home/jenkins/minikube-integration/19283-14409/kubeconfig needs updating (will repair): [kubeconfig missing "ha-333994" cluster setting kubeconfig missing "ha-333994" context setting]
	I0717 17:49:33.870894   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.871258   39794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.871471   39794 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.180:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)
}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:49:33.871875   39794 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:49:33.872033   39794 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0717 17:49:33.881089   39794 kubeadm.go:630] The running cluster does not require reconfiguration: 192.168.39.180
	I0717 17:49:33.881107   39794 kubeadm.go:597] duration metric: took 19.938705ms to restartPrimaryControlPlane
	I0717 17:49:33.881113   39794 kubeadm.go:394] duration metric: took 81.089134ms to StartCluster
	I0717 17:49:33.881124   39794 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.881175   39794 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.881658   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.881845   39794 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:49:33.881872   39794 start.go:241] waiting for startup goroutines ...
	I0717 17:49:33.881879   39794 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:49:33.882084   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:33.884129   39794 out.go:177] * Enabled addons: 
	I0717 17:49:33.885737   39794 addons.go:510] duration metric: took 3.853682ms for enable addons: enabled=[]
	I0717 17:49:33.885760   39794 start.go:246] waiting for cluster config update ...
	I0717 17:49:33.885767   39794 start.go:255] writing updated cluster config ...
	I0717 17:49:33.887338   39794 out.go:177] 
	I0717 17:49:33.888767   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:33.888845   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:33.890338   39794 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:49:33.891461   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:33.891475   39794 cache.go:56] Caching tarball of preloaded images
	I0717 17:49:33.891543   39794 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:49:33.891554   39794 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:49:33.891626   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:33.891771   39794 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:49:33.891806   39794 start.go:364] duration metric: took 19.128µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:49:33.891819   39794 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:49:33.891826   39794 fix.go:54] fixHost starting: m02
	I0717 17:49:33.892056   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:33.892076   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:33.906264   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44047
	I0717 17:49:33.906599   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:33.907064   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:33.907083   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:33.907400   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:33.907566   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:33.907713   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:49:33.909180   39794 fix.go:112] recreateIfNeeded on ha-333994-m02: state=Stopped err=<nil>
	I0717 17:49:33.909199   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	W0717 17:49:33.909338   39794 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:49:33.911077   39794 out.go:177] * Restarting existing kvm2 VM for "ha-333994-m02" ...
	I0717 17:49:33.912122   39794 main.go:141] libmachine: (ha-333994-m02) Calling .Start
	I0717 17:49:33.912246   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:49:33.912879   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:49:33.913156   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:49:33.913539   39794 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:49:33.914190   39794 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:49:35.092192   39794 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:49:35.092951   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.093269   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.093360   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.093273   39957 retry.go:31] will retry after 192.383731ms: waiting for machine to come up
	I0717 17:49:35.287679   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.288078   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.288104   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.288046   39957 retry.go:31] will retry after 385.654698ms: waiting for machine to come up
	I0717 17:49:35.675666   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.676036   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.676064   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.675991   39957 retry.go:31] will retry after 420.16772ms: waiting for machine to come up
	I0717 17:49:36.097264   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:36.097632   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:36.097689   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:36.097608   39957 retry.go:31] will retry after 593.383084ms: waiting for machine to come up
	I0717 17:49:36.692388   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:36.692779   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:36.692805   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:36.692748   39957 retry.go:31] will retry after 522.894623ms: waiting for machine to come up
	I0717 17:49:37.217539   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:37.217939   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:37.217974   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:37.217901   39957 retry.go:31] will retry after 618.384823ms: waiting for machine to come up
	I0717 17:49:37.837779   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:37.838175   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:37.838200   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:37.838142   39957 retry.go:31] will retry after 1.091652031s: waiting for machine to come up
	I0717 17:49:38.931763   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:38.932219   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:38.932247   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:38.932134   39957 retry.go:31] will retry after 1.341674427s: waiting for machine to come up
	I0717 17:49:40.275320   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:40.275792   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:40.275820   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:40.275754   39957 retry.go:31] will retry after 1.293235927s: waiting for machine to come up
	I0717 17:49:41.571340   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:41.571705   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:41.571732   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:41.571661   39957 retry.go:31] will retry after 1.542371167s: waiting for machine to come up
	I0717 17:49:43.115333   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:43.115796   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:43.115826   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:43.115760   39957 retry.go:31] will retry after 1.886589943s: waiting for machine to come up
	I0717 17:49:45.004358   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:45.004727   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:45.004763   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:45.004693   39957 retry.go:31] will retry after 2.72551249s: waiting for machine to come up
	I0717 17:49:47.733475   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:47.733874   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:47.733902   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:47.733829   39957 retry.go:31] will retry after 3.239443396s: waiting for machine to come up
	I0717 17:49:50.975432   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.975912   39794 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:49:50.975930   39794 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:49:50.975960   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.976436   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:50.976461   39794 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:49:50.976480   39794 main.go:141] libmachine: (ha-333994-m02) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"}
	I0717 17:49:50.976499   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:49:50.976514   39794 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:49:50.978829   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.979226   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:50.979246   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.979387   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:49:50.979411   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:49:50.979431   39794 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:49:50.979444   39794 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:49:50.979455   39794 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:49:51.106070   39794 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:49:51.106413   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:49:51.106973   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:51.109287   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.109592   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.109618   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.109826   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:51.110023   39794 machine.go:94] provisionDockerMachine start ...
	I0717 17:49:51.110040   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.110237   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.112084   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.112321   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.112346   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.112436   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.112578   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.112724   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.112869   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.113027   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.113194   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.113205   39794 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:49:51.214365   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:49:51.214388   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.214600   39794 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:49:51.214629   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.214801   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.217146   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.217465   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.217489   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.217600   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.217758   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.217934   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.218049   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.218223   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.218385   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.218401   39794 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:49:51.334279   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:49:51.334317   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.337581   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.337905   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.337933   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.338139   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.338346   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.338512   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.338693   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.338845   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.339025   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.339046   39794 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
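The SSH command above pins the node's hostname in `/etc/hosts`: if no line already ends with the hostname, it rewrites an existing `127.0.1.1` entry in place, otherwise it appends one. A pure-Python sketch of the same transformation applied to a hosts-file string (illustrative only, not minikube code):

```python
import re

def pin_hostname(hosts: str, name: str) -> str:
    """Replicate the shell logic: if no line already maps `name`,
    rewrite an existing 127.0.1.1 line or append a new entry."""
    if re.search(rf"^.*\s{re.escape(name)}$", hosts, flags=re.M):
        return hosts  # hostname already present, nothing to do
    if re.search(r"^127\.0\.1\.1\s", hosts, flags=re.M):
        # sed -i 's/^127.0.1.1\s.*/127.0.1.1 NAME/g' equivalent
        return re.sub(r"^127\.0\.1\.1\s.*$", f"127.0.1.1 {name}",
                      hosts, flags=re.M)
    # echo '127.0.1.1 NAME' | tee -a /etc/hosts equivalent
    return hosts.rstrip("\n") + f"\n127.0.1.1 {name}\n"
```

The empty `SSH cmd err, output: <nil>` line that follows indicates the in-place rewrite branch ran without output, as expected when a `127.0.1.1` entry already exists.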
	I0717 17:49:51.454925   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:49:51.454956   39794 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:49:51.454978   39794 buildroot.go:174] setting up certificates
	I0717 17:49:51.454987   39794 provision.go:84] configureAuth start
	I0717 17:49:51.454999   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.455257   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:51.457564   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.457851   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.457873   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.458013   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.459810   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.460165   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.460190   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.460306   39794 provision.go:143] copyHostCerts
	I0717 17:49:51.460327   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:51.460352   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:49:51.460360   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:51.460411   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:49:51.460474   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:51.460493   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:49:51.460497   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:51.460514   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:49:51.460556   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:51.460571   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:49:51.460577   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:51.460593   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:49:51.460641   39794 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:49:51.635236   39794 provision.go:177] copyRemoteCerts
	I0717 17:49:51.635286   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:49:51.635308   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.638002   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.638369   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.638395   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.638622   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.638815   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.638982   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.639145   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:51.720405   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:49:51.720478   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:49:51.746352   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:49:51.746412   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:49:51.770628   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:49:51.770702   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:49:51.795258   39794 provision.go:87] duration metric: took 340.256082ms to configureAuth
	I0717 17:49:51.795284   39794 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:49:51.795490   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:51.795501   39794 machine.go:97] duration metric: took 685.467301ms to provisionDockerMachine
	I0717 17:49:51.795514   39794 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:49:51.795528   39794 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:49:51.795563   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.795850   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:49:51.795874   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.798310   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.798696   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.798719   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.798889   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.799047   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.799191   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.799286   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:51.881403   39794 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:49:51.885516   39794 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:49:51.885542   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:49:51.885603   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:49:51.885687   39794 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:49:51.885697   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:49:51.885773   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:49:51.894953   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:51.919442   39794 start.go:296] duration metric: took 123.913575ms for postStartSetup
	I0717 17:49:51.919487   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.919775   39794 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:49:51.919801   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.922159   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.922506   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.922533   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.922672   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.922878   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.923036   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.923152   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:52.004408   39794 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:49:52.004481   39794 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:49:52.063014   39794 fix.go:56] duration metric: took 18.171175537s for fixHost
	I0717 17:49:52.063061   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:52.065858   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.066239   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.066269   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.066459   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.066648   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.066806   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.066931   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.067086   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:52.067288   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:52.067303   39794 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0717 17:49:52.166802   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238592.140235525
	
	I0717 17:49:52.166826   39794 fix.go:216] guest clock: 1721238592.140235525
	I0717 17:49:52.166835   39794 fix.go:229] Guest: 2024-07-17 17:49:52.140235525 +0000 UTC Remote: 2024-07-17 17:49:52.063042834 +0000 UTC m=+40.822975139 (delta=77.192691ms)
	I0717 17:49:52.166849   39794 fix.go:200] guest clock delta is within tolerance: 77.192691ms
	I0717 17:49:52.166853   39794 start.go:83] releasing machines lock for "ha-333994-m02", held for 18.275039229s
	I0717 17:49:52.166873   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.167105   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:52.169592   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.169924   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.169948   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.172181   39794 out.go:177] * Found network options:
	I0717 17:49:52.173607   39794 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:49:52.174972   39794 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:49:52.175003   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175597   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175781   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175858   39794 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:49:52.175897   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:49:52.175951   39794 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:49:52.176007   39794 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:49:52.176024   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:52.178643   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.178748   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179072   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.179098   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179230   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.179248   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.179272   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179432   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.179524   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.179596   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.179664   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.179721   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.179794   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:52.179844   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:49:52.256371   39794 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:49:52.256433   39794 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:49:52.287825   39794 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:49:52.287848   39794 start.go:495] detecting cgroup driver to use...
	I0717 17:49:52.287901   39794 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:49:52.316497   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:49:52.330140   39794 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:49:52.330189   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:49:52.343721   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:49:52.357273   39794 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:49:52.483050   39794 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:49:52.682504   39794 docker.go:233] disabling docker service ...
	I0717 17:49:52.682571   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:49:52.702383   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:49:52.717022   39794 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:49:52.851857   39794 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:49:52.989407   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:49:53.003913   39794 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:49:53.024876   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:49:53.035470   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:49:53.046129   39794 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:49:53.046184   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:49:53.056553   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:53.067211   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:49:53.077626   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:53.088680   39794 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:49:53.100371   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:49:53.111920   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:49:53.123072   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:49:53.133713   39794 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:49:53.143333   39794 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:49:53.143405   39794 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:49:53.157890   39794 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:49:53.167934   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:53.302893   39794 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:49:53.333425   39794 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:49:53.333488   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:53.339060   39794 retry.go:31] will retry after 1.096332725s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:49:54.435963   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:54.441531   39794 start.go:563] Will wait 60s for crictl version
	I0717 17:49:54.441599   39794 ssh_runner.go:195] Run: which crictl
	I0717 17:49:54.445786   39794 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:49:54.483822   39794 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:49:54.483877   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:54.518845   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:54.553079   39794 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:49:54.554649   39794 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:49:54.556061   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:54.559046   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:54.559422   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:54.559444   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:54.559695   39794 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:49:54.564470   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:54.579269   39794 mustload.go:65] Loading cluster: ha-333994
	I0717 17:49:54.579483   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:54.579765   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:54.579792   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:54.594439   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39001
	I0717 17:49:54.594883   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:54.595350   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:54.595374   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:54.595675   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:54.595858   39794 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:49:54.597564   39794 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:49:54.597896   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:54.597921   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:54.613634   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34405
	I0717 17:49:54.614031   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:54.614493   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:54.614511   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:54.614816   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:54.615002   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:54.615153   39794 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:49:54.615165   39794 certs.go:194] generating shared ca certs ...
	I0717 17:49:54.615183   39794 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:54.615314   39794 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:49:54.615354   39794 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:49:54.615363   39794 certs.go:256] generating profile certs ...
	I0717 17:49:54.615452   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:49:54.615493   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:49:54.615524   39794 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:49:54.615535   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:49:54.615548   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:49:54.615560   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:49:54.615575   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:49:54.615587   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:49:54.615599   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:49:54.615635   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:49:54.615651   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:49:54.615692   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:49:54.615716   39794 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:49:54.615731   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:49:54.615754   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:49:54.615774   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:49:54.615795   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:49:54.615829   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:54.615854   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:49:54.615866   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:49:54.615877   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:54.615902   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:54.618791   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:54.619169   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:54.619191   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:54.619351   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:54.619524   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:54.619660   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:54.619789   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:54.694549   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:49:54.699693   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:49:54.711136   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:49:54.715759   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:49:54.727707   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:49:54.732038   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:49:54.743206   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:49:54.747536   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:49:54.759182   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:49:54.763279   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:49:54.774195   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:49:54.778345   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:49:54.790000   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:49:54.817482   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:49:54.842528   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:49:54.867521   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:49:54.893528   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0717 17:49:54.920674   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:49:54.946673   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:49:54.972385   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:49:54.997675   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:49:55.023298   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:49:55.048552   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:49:55.073345   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:49:55.091193   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:49:55.108383   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:49:55.125529   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:49:55.142804   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:49:55.160482   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:49:55.178995   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:49:55.197026   39794 ssh_runner.go:195] Run: openssl version
	I0717 17:49:55.202998   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:49:55.214662   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.219373   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.219447   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.225441   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:49:55.236543   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:49:55.247672   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.252336   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.252396   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.258207   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:49:55.269215   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:49:55.280136   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.284763   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.284843   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.290471   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:49:55.301174   39794 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:49:55.305201   39794 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:49:55.305253   39794 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:49:55.305343   39794 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:49:55.305377   39794 kube-vip.go:115] generating kube-vip config ...
	I0717 17:49:55.305412   39794 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:49:55.322820   39794 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:49:55.322885   39794 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:49:55.322938   39794 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:49:55.332945   39794 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:49:55.333009   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0717 17:49:55.342555   39794 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
	I0717 17:49:55.358883   39794 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:49:55.375071   39794 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:49:55.393413   39794 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:49:55.397331   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:55.411805   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:55.535806   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:55.554620   39794 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:49:55.554913   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:55.556751   39794 out.go:177] * Verifying Kubernetes components...
	I0717 17:49:55.558066   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:55.748334   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:56.613699   39794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:56.613920   39794 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)
}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0717 17:49:56.613970   39794 kubeadm.go:483] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.180:8443
	I0717 17:49:56.614170   39794 node_ready.go:35] waiting up to 6m0s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:49:56.614265   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:56.614272   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:56.614280   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:56.614286   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:56.627325   39794 round_trippers.go:574] Response Status: 404 Not Found in 13 milliseconds
	I0717 17:49:57.115057   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:57.115083   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:57.115091   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:57.115095   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:57.117582   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:57.614333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:57.614354   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:57.614362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:57.614365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:57.616581   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:58.115352   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:58.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:58.115362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:58.117636   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.615397   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:58.615423   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:58.615434   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:58.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:58.617780   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.617919   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:49:59.114753   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:59.114774   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:59.114782   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:59.114787   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:59.116989   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:59.615261   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:59.615289   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:59.615299   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:59.615305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:59.617539   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:00.115327   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:00.115348   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:00.115356   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:00.115359   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:00.117595   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:00.615335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:00.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:00.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:00.615371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:00.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:01.115332   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:01.115352   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:01.115360   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:01.115364   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:01.118462   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:01.118555   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:01.614396   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:01.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:01.614425   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:01.614429   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:01.616688   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:02.115381   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:02.115413   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:02.115424   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:02.115429   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:02.117845   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:02.614519   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:02.614541   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:02.614549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:02.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:02.616973   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.114666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:03.114690   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:03.114706   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:03.114711   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:03.116808   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.614478   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:03.614500   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:03.614508   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:03.614512   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:03.616763   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.616861   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:04.115079   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:04.115103   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:04.115110   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:04.115116   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:04.117400   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:04.614899   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:04.614922   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:04.614932   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:04.614936   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:04.617138   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.115001   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:05.115024   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:05.115031   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:05.115039   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:05.117375   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.615121   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:05.615145   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:05.615153   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:05.615158   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:05.617472   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.617581   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:06.115206   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:06.115226   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:06.115235   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:06.115240   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:06.117694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:06.614430   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:06.614453   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:06.614462   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:06.614467   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:06.616849   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:07.115357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:07.115378   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:07.115386   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:07.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:07.117909   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:07.614460   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:07.614484   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:07.614492   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:07.614497   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:07.617064   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:08.115383   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:08.115405   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:08.115412   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:08.115417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:08.117848   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:08.117947   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:08.614392   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:08.614415   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:08.614423   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:08.614427   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:08.616608   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:09.114929   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:09.114950   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:09.114958   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:09.114962   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:09.117409   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:09.614639   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:09.614659   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:09.614666   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:09.614670   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:09.616904   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.114644   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:10.114668   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:10.114676   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:10.114685   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:10.117224   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.614973   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:10.614995   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:10.615003   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:10.615007   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:10.617362   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.617474   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:11.115160   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:11.115187   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:11.115197   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:11.115202   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:11.117916   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:11.615031   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:11.615053   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:11.615061   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:11.615065   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:11.617581   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.115275   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:12.115297   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:12.115305   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:12.115308   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:12.117625   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.615329   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:12.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:12.615364   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:12.615367   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:12.617808   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.617929   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:13.114465   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:13.114488   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:13.114497   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:13.114501   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:13.116973   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:13.614674   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:13.614704   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:13.614715   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:13.614721   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:13.617161   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:14.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:14.115351   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:14.115357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:14.115360   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:14.117798   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:14.615028   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:14.615052   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:14.615062   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:14.615068   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:14.617174   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:15.115117   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:15.115140   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:15.115149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:15.115154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:15.117832   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:15.117958   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:15.614474   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:15.614517   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:15.614528   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:15.614534   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:15.616851   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:16.114493   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:16.114529   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:16.114536   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:16.114540   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:16.117140   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:16.614895   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:16.614922   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:16.614935   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:16.614943   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:16.617847   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.114480   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:17.114500   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:17.114507   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:17.114510   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:17.116841   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.614484   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:17.614505   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:17.614512   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:17.614515   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:17.616877   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.617049   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:18.115326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:18.115346   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:18.115354   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:18.115358   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:18.117690   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:18.614346   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:18.614364   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:18.614372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:18.614377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:18.617203   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:19.114315   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:19.114349   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:19.114357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:19.114362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:19.119328   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:50:19.614516   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:19.614536   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:19.614544   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:19.614549   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:19.616974   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:19.617173   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:20.114872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:20.114896   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:20.114905   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:20.114908   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:20.117228   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:20.614953   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:20.614974   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:20.614981   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:20.614987   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:20.617553   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:21.115256   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:21.115288   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:21.115297   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:21.115305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:21.117516   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:21.614470   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:21.614493   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:21.614504   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:21.614512   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:21.616801   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:22.114458   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:22.114481   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:22.114491   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:22.114497   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:22.116704   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:22.116814   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:22.614361   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:22.614383   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:22.614391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:22.614395   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:22.616868   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:23.115322   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:23.115343   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:23.115351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:23.115356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:23.117765   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:23.614438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:23.614469   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:23.614480   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:23.614486   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:23.616902   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:24.115254   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:24.115277   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:24.115287   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:24.115292   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:24.117319   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:24.117422   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:24.614618   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:24.614655   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:24.614665   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:24.614669   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:24.617182   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:25.115401   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:25.115422   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:25.115430   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:25.115434   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:25.118835   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:25.614325   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:25.614351   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:25.614361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:25.614366   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:25.616764   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.114413   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:26.114451   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:26.114460   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:26.114464   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:26.117000   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.614789   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:26.614815   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:26.614826   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:26.614831   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:26.617192   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.617279   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:27.114863   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:27.114888   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:27.114897   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:27.114903   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:27.117792   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:27.615352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:27.615378   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:27.615389   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:27.615394   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:27.618057   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:28.115330   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:28.115353   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:28.115362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:28.115365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:28.117820   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:28.615355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:28.615377   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:28.615385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:28.615389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:28.619637   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:50:28.619765   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:29.114706   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:29.114727   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:29.114734   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:29.114738   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:29.117064   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:29.614803   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:29.614826   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:29.614835   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:29.614839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:29.617436   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:30.114527   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:30.114550   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:30.114560   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:30.114565   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:30.116856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:30.614518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:30.614542   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:30.614551   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:30.614554   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:30.617139   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:31.114819   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:31.114856   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:31.114867   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:31.114873   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:31.117237   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:31.117345   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:31.615179   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:31.615203   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:31.615219   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:31.615224   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:31.617525   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:32.115306   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:32.115329   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:32.115337   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:32.115341   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:32.117639   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:32.614367   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:32.614391   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:32.614399   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:32.614403   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:32.617172   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:33.115127   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:33.115150   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:33.115158   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:33.115162   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:33.117796   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:33.117911   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:33.614544   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:33.614586   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:33.614597   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:33.614611   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:33.616706   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:34.115175   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:34.115197   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:34.115207   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:34.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:34.117345   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:34.614352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:34.614373   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:34.614380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:34.614384   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:34.616826   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.114840   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:35.114867   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:35.114876   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:35.114881   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:35.117298   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.615114   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:35.615140   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:35.615149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:35.615154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:35.617788   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.617897   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:36.115372   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:36.115393   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:36.115402   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:36.115405   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:36.117735   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:36.615357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:36.615376   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:36.615383   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:36.615388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:36.617891   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:37.114533   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:37.114559   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:37.114567   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:37.114572   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:37.117729   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:37.615384   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:37.615406   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:37.615414   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:37.615417   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:37.617760   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:38.114425   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:38.114448   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:38.114455   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:38.114458   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:38.117016   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:38.117135   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:38.614755   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:38.614779   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:38.614787   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:38.614790   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:38.617099   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:39.115282   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:39.115303   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:39.115311   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:39.115315   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:39.117895   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:39.614832   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:39.614853   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:39.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:39.614865   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:39.617355   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:40.115339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:40.115361   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:40.115369   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:40.115373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:40.117661   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:40.117759   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:40.614368   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:40.614389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:40.614396   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:40.614399   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:40.616881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:41.114581   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:41.114606   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:41.114616   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:41.114622   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:41.116877   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:41.614884   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:41.614906   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:41.614914   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:41.614919   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:41.617263   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:42.115156   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:42.115181   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:42.115193   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:42.115201   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:42.117713   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:42.117819   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:42.614328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:42.614348   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:42.614356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:42.614361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:42.617382   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:43.115106   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:43.115127   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:43.115135   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:43.115140   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:43.117355   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:43.615121   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:43.615142   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:43.615149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:43.615154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:43.617549   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.114805   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:44.114826   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:44.114834   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:44.114839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:44.117204   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.615412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:44.615431   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:44.615439   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:44.615444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:44.617856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.617969   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:45.115069   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:45.115093   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:45.115105   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:45.115110   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:45.117220   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:45.614988   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:45.615008   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:45.615015   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:45.615018   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:45.617421   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:46.115156   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:46.115178   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:46.115199   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:46.115203   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:46.117371   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:46.615053   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:46.615076   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:46.615086   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:46.615094   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:46.617407   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:47.115173   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:47.115195   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:47.115203   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:47.115207   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:47.117871   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:47.117975   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:47.614555   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:47.614577   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:47.614586   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:47.614589   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:47.617103   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:48.114743   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:48.114770   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:48.114782   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:48.114787   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:48.116997   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:48.614683   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:48.614710   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:48.614721   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:48.614734   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:48.617185   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.115307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:49.115332   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:49.115343   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:49.115347   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:49.117646   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.614838   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:49.614858   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:49.614872   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:49.614880   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:49.617342   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.617440   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:50.115333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:50.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:50.115372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:50.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:50.117536   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:50.615245   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:50.615270   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:50.615278   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:50.615282   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:50.617747   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:51.114366   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:51.114389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:51.114396   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:51.114400   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:51.116597   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:51.614367   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:51.614389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:51.614397   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:51.614401   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:51.616747   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:52.114431   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:52.114453   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:52.114461   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:52.114464   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:52.117371   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:52.117470   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:52.615088   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:52.615111   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:52.615118   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:52.615122   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:52.617416   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:53.115173   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:53.115195   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:53.115203   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:53.115208   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:53.117683   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:53.614356   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:53.614376   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:53.614384   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:53.614388   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:53.616703   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:54.114990   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:54.115013   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:54.115020   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:54.115024   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:54.117855   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:54.117941   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:54.615104   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:54.615125   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:54.615135   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:54.615140   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:54.617869   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:55.114983   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:55.115005   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:55.115012   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:55.115016   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:55.117396   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:55.615131   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:55.615152   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:55.615168   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:55.615171   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:55.617453   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.115180   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:56.115201   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:56.115209   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:56.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:56.117326   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.615051   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:56.615074   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:56.615082   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:56.615087   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:56.617369   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.617480   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:57.115080   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:57.115102   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:57.115110   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:57.115114   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:57.117510   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:57.615224   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:57.615246   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:57.615254   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:57.615258   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:57.617511   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:58.114791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:58.114811   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:58.114819   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:58.114823   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:58.117265   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:58.614995   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:58.615015   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:58.615023   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:58.615028   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:58.617145   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:59.115321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:59.115342   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:59.115350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:59.115353   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:59.117772   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:59.117893   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:59.614872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:59.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:59.614903   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:59.614906   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:59.617194   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:00.115270   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:00.115293   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:00.115301   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:00.115304   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:00.117653   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:00.615357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:00.615379   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:00.615387   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:00.615391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:00.617720   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.114385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:01.114407   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:01.114413   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:01.114416   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:01.116717   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.614708   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:01.614735   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:01.614745   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:01.614751   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:01.617211   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.617309   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:02.114916   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:02.114948   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:02.114956   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:02.114965   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:02.117244   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:02.614964   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:02.614987   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:02.614995   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:02.614999   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:02.617512   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:03.115219   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:03.115239   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:03.115247   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:03.115251   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:03.117907   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:03.614525   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:03.614547   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:03.614557   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:03.614561   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:03.621322   39794 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0717 17:51:03.621424   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:04.114491   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:04.114513   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:04.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:04.114527   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:04.116543   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:04.614666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:04.614688   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:04.614699   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:04.614705   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:04.616831   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:05.114969   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:05.114996   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:05.115003   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:05.115008   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:05.117465   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:05.615208   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:05.615231   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:05.615240   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:05.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:05.617689   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:06.114340   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:06.114360   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:06.114368   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:06.114372   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:06.116445   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:06.116590   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:06.615129   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:06.615154   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:06.615165   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:06.615172   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:06.617588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:07.115324   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:07.115349   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:07.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:07.115363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:07.117841   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:07.614530   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:07.614557   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:07.614566   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:07.614570   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:07.617073   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:08.114714   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:08.114739   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:08.114750   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:08.114756   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:08.117056   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:08.117161   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:08.615333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:08.615352   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:08.615360   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:08.615363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:08.617848   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:09.114938   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:09.114965   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:09.114974   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:09.114980   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:09.118060   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:09.615157   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:09.615177   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:09.615186   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:09.615192   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:09.617894   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:10.115084   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:10.115104   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:10.115112   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:10.115120   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:10.117391   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:10.117508   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:10.615120   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:10.615145   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:10.615155   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:10.615161   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:10.617842   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:11.114485   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:11.114507   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:11.114515   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:11.114520   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:11.117245   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:11.615400   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:11.615426   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:11.615437   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:11.615444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:11.617790   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:12.115351   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:12.115374   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:12.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:12.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:12.117803   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:12.117915   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:12.614461   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:12.614485   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:12.614495   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:12.614500   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:12.617208   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:13.114980   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:13.115005   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:13.115016   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:13.115020   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:13.117385   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:13.615122   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:13.615148   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:13.615160   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:13.615166   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:13.617805   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.115212   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:14.115244   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:14.115253   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:14.115258   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:14.117528   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.614681   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:14.614701   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:14.614711   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:14.614717   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:14.617113   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.617211   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:15.115267   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:15.115291   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:15.115302   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:15.115309   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:15.117537   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:15.615307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:15.615331   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:15.615340   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:15.615345   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:15.617660   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.115400   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:16.115426   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:16.115437   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:16.115444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:16.118040   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.614666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:16.614688   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:16.614698   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:16.614703   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:16.617162   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.617258   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:17.114829   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:17.114853   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:17.114863   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:17.114868   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:17.117547   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:17.615274   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:17.615295   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:17.615316   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:17.615323   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:17.617344   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.115064   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:18.115086   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:18.115097   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:18.115101   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:18.117232   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.614999   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:18.615021   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:18.615032   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:18.615037   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:18.617285   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.617392   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:19.114376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:19.114407   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:19.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:19.114451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:19.117257   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:19.615315   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:19.615335   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:19.615344   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:19.615350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:19.617155   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:20.115264   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:20.115284   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:20.115292   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:20.115296   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:20.117412   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:20.615133   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:20.615154   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:20.615162   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:20.615165   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:20.616967   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:21.114603   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:21.114639   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:21.114648   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:21.114655   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:21.116866   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:21.116957   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:21.614816   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:21.614841   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:21.614850   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:21.614854   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:21.617362   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:22.115139   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:22.115162   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:22.115170   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:22.115174   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:22.117729   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:22.614412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:22.614434   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:22.614440   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:22.614444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:22.617178   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.114352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:23.114377   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:23.114388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:23.114392   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:23.116563   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.615345   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:23.615372   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:23.615380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:23.615383   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:23.618002   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.618112   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:24.115378   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:24.115401   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:24.115411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:24.115418   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:24.117758   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:24.614891   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:24.614912   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:24.614922   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:24.614926   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:24.617332   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:25.115412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:25.115436   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:25.115445   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:25.115448   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:25.117910   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:25.614339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:25.614363   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:25.614371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:25.614375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:25.617451   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:26.115183   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:26.115207   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:26.115219   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:26.115225   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:26.117163   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:26.117274   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:26.614942   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:26.614966   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:26.614977   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:26.614984   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:26.617676   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:27.115347   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:27.115370   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:27.115380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:27.115385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:27.117861   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:27.615326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:27.615350   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:27.615359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:27.615363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:27.618250   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:28.114551   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:28.114569   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:28.114577   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:28.114583   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:28.117333   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:28.117440   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:28.615148   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:28.615180   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:28.615191   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:28.615196   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:28.617554   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:29.114764   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:29.114789   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:29.114800   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:29.114804   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:29.116808   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:29.615144   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:29.615168   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:29.615180   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:29.615195   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:29.617588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:30.114646   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:30.114670   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:30.114678   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:30.114681   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:30.116515   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:30.615245   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:30.615265   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:30.615273   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:30.615276   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:30.617998   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:30.618150   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:31.115373   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:31.115395   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:31.115403   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:31.115407   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:31.117657   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:31.614754   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:31.614781   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:31.614789   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:31.614793   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:31.616938   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:32.115334   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:32.115357   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:32.115370   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:32.115374   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:32.117890   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:32.614529   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:32.614551   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:32.614559   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:32.614563   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:32.617063   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:33.114739   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:33.114762   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:33.114769   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:33.114773   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:33.116876   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:33.116968   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:33.614545   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:33.614566   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:33.614574   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:33.614579   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:33.616992   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:34.115382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:34.115403   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:34.115411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:34.115414   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:34.117715   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:34.614863   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:34.614881   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:34.614888   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:34.614893   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:34.617243   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:35.115352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:35.115375   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:35.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:35.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:35.117853   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:35.117957   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:35.614511   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:35.614533   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:35.614541   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:35.614547   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:35.617000   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:36.114661   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:36.114682   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:36.114690   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:36.114695   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:36.117055   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:36.614872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:36.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:36.614903   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:36.614908   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:36.617081   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.114747   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:37.114772   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:37.114783   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:37.114788   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:37.117323   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.615053   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:37.615075   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:37.615086   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:37.615094   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:37.617571   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.617677   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:38.115271   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:38.115293   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:38.115301   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:38.115305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:38.117337   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:38.615114   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:38.615136   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:38.615143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:38.615146   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:38.617524   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:39.114693   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:39.114717   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:39.114726   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:39.114731   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:39.116906   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:39.615059   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:39.615078   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:39.615086   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:39.615090   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:39.617554   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:40.114620   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:40.114645   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:40.114655   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:40.114659   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:40.116637   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:40.116742   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:40.615346   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:40.615368   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:40.615379   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:40.615385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:40.617774   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:41.114442   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:41.114463   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:41.114470   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:41.114474   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:41.116924   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:41.614862   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:41.614882   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:41.614890   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:41.614893   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:41.617121   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:42.114844   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:42.114871   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:42.114880   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:42.114887   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:42.117456   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:42.117549   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:42.615184   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:42.615219   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:42.615228   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:42.615231   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:42.617697   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:43.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:43.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:43.115374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:43.115377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:43.117888   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:43.614542   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:43.614564   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:43.614572   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:43.614575   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:43.617156   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.114390   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:44.114418   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:44.114430   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:44.114435   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:44.116806   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.614781   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:44.614799   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:44.614808   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:44.614813   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:44.616969   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.617103   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:45.115008   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:45.115031   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:45.115040   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:45.115044   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:45.117431   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:45.615224   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:45.615252   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:45.615262   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:45.615266   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:45.617533   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.115209   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:46.115230   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:46.115238   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:46.115243   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:46.118193   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.614898   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:46.614921   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:46.614928   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:46.614932   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:46.617234   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.617429   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:47.115009   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:47.115032   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:47.115040   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:47.115044   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:47.117484   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:47.615213   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:47.615236   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:47.615245   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:47.615249   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:47.617602   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:48.115343   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:48.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:48.115371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:48.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:48.117939   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:48.614599   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:48.614625   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:48.614634   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:48.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:48.617112   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:49.115322   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:49.115343   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:49.115351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:49.115356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:49.117738   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:49.117854   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:49.614434   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:49.614465   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:49.614475   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:49.614479   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:49.617641   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:50.115335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:50.115358   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:50.115366   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:50.115370   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:50.117407   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:50.615319   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:50.615340   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:50.615348   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:50.615353   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:50.617718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.114376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:51.114398   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:51.114407   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:51.114414   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:51.116810   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.614799   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:51.614831   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:51.614839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:51.614844   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:51.617260   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.617398   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:52.115069   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:52.115094   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:52.115102   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:52.115108   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:52.117538   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:52.615321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:52.615352   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:52.615361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:52.615365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:52.617834   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:53.114486   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:53.114512   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:53.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:53.114527   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:53.118242   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:53.615003   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:53.615034   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:53.615045   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:53.615051   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:53.617718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:53.617826   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:54.115063   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:54.115091   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:54.115100   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:54.115105   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:54.117425   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:54.615271   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:54.615295   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:54.615304   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:54.615309   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:54.617987   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:55.115096   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:55.115119   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:55.115127   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:55.115131   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:55.117113   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:55.614857   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:55.614881   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:55.614897   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:55.614903   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:55.617711   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:56.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:56.115349   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:56.115357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:56.115361   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:56.118008   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:56.118139   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:56.614719   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:56.614745   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:56.614752   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:56.614756   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:56.617529   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:57.115288   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:57.115310   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:57.115318   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:57.115321   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:57.117714   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:57.614495   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:57.614517   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:57.614525   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:57.614528   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:57.616925   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.114573   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:58.114598   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:58.114609   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:58.114613   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:58.116783   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.614438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:58.614459   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:58.614469   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:58.614476   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:58.616851   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.616956   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:59.115030   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:59.115055   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:59.115066   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:59.115073   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:59.117347   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:59.615128   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:59.615151   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:59.615159   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:59.615164   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:59.617627   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.114672   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:00.114694   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:00.114702   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:00.114706   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:00.117073   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.614975   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:00.614999   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:00.615009   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:00.615014   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:00.617143   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.617251   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:01.114805   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:01.114842   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:01.114852   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:01.114858   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:01.117434   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:01.614407   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:01.614440   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:01.614448   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:01.614453   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:01.617018   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:02.114693   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:02.114715   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:02.114722   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:02.114727   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:02.116963   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:02.614625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:02.614650   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:02.614660   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:02.614664   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:02.617042   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:03.114744   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:03.114767   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:03.114775   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:03.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:03.116932   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:03.117041   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:03.614597   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:03.614618   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:03.614626   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:03.614630   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:03.616748   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:04.115018   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:04.115039   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:04.115049   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:04.115053   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:04.117556   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:04.615321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:04.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:04.615361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:04.615368   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:04.617694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:05.114830   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:05.114857   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:05.114865   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:05.114869   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:05.117278   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:05.117380   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:05.615000   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:05.615035   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:05.615046   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:05.615052   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:05.617339   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:06.115037   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:06.115056   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:06.115062   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:06.115066   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:06.117588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:06.614309   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:06.614333   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:06.614341   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:06.614346   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:06.616516   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:07.115312   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:07.115336   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:07.115345   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:07.115349   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:07.117526   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:07.117714   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:07.615376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:07.615398   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:07.615406   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:07.615410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:07.617826   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:08.114477   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:08.114499   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:08.114507   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:08.114511   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:08.116889   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:08.614611   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:08.614639   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:08.614649   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:08.614658   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:08.616943   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.115169   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:09.115191   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:09.115199   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:09.115202   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:09.117574   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.615328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:09.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:09.615357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:09.615361   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:09.617889   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.618007   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:10.115232   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:10.115254   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:10.115262   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:10.115268   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:10.117721   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:10.614358   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:10.614381   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:10.614388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:10.614391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:10.616539   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:11.115338   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:11.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:11.115377   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:11.115384   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:11.117600   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:11.614501   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:11.614525   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:11.614535   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:11.614539   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:11.616883   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:12.114522   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:12.114544   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:12.114552   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:12.114557   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:12.117075   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:12.117189   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:12.614823   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:12.614850   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:12.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:12.614866   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:12.617132   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:13.114797   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:13.114818   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:13.114830   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:13.114835   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:13.117193   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:13.614859   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:13.614880   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:13.614887   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:13.614891   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:13.617224   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:14.114680   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:14.114701   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:14.114708   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:14.114713   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:14.117640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:14.117759   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:14.615371   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:14.615399   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:14.615408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:14.615412   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:14.617899   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:15.115288   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:15.115307   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:15.115316   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:15.115320   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:15.117625   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:15.615379   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:15.615399   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:15.615407   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:15.615410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:15.617678   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.115335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:16.115358   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:16.115368   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:16.115373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:16.117508   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.615332   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:16.615355   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:16.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:16.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:16.617762   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.617852   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:17.115342   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:17.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:17.115374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:17.115380   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:17.117745   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:17.614381   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:17.614404   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:17.614411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:17.614414   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:17.616676   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:18.114344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:18.114365   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:18.114372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:18.114377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:18.116126   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:18.614823   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:18.614850   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:18.614859   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:18.614863   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:18.617249   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:19.114382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:19.114404   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:19.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:19.114422   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:19.116549   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:19.116667   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:19.615132   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:19.615157   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:19.615166   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:19.615171   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:19.617897   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:20.115394   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:20.115422   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:20.115433   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:20.115438   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:20.120626   39794 round_trippers.go:574] Response Status: 404 Not Found in 5 milliseconds
	I0717 17:52:20.615314   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:20.615335   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:20.615343   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:20.615348   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:20.617815   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:21.114476   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:21.114497   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:21.114509   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:21.114516   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:21.116694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:21.116789   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:21.614568   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:21.614590   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:21.614596   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:21.614600   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:21.616740   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:22.114442   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:22.114465   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:22.114472   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:22.114477   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:22.116620   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:22.615373   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:22.615414   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:22.615422   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:22.615425   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:22.617822   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:23.115355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:23.115377   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:23.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:23.115390   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:23.117793   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:23.117961   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:23.614462   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:23.614484   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:23.614492   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:23.614495   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:23.616758   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:24.115153   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:24.115174   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:24.115183   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:24.115187   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:24.117485   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:24.615251   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:24.615278   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:24.615289   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:24.615294   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:24.618155   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.114625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:25.114648   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:25.114656   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:25.114660   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:25.117162   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.614843   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:25.614863   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:25.614871   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:25.614875   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:25.616943   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.617057   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:26.114625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:26.114665   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:26.114677   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:26.114681   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:26.116743   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:26.614490   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:26.614512   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:26.614521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:26.614524   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:26.616812   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:27.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:27.115366   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:27.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:27.115379   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:27.117751   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:27.614385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:27.614429   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:27.614436   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:27.614440   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:27.616766   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:28.114438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:28.114463   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:28.114472   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:28.114476   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:28.116881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:28.116995   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:28.614550   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:28.614573   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:28.614583   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:28.614589   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:28.616576   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:29.114665   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:29.114688   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:29.114697   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:29.114701   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:29.116949   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:29.614618   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:29.614639   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:29.614647   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:29.614652   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:29.617229   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:30.114692   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:30.114711   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:30.114718   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:30.114725   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:30.116453   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:30.615200   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:30.615233   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:30.615241   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:30.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:30.617947   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:30.618078   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:31.114620   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:31.114663   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:31.114674   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:31.114677   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:31.116821   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:31.614807   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:31.614849   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:31.614857   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:31.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:31.617107   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:32.114733   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:32.114772   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:32.114780   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:32.114784   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:32.117117   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:32.614873   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:32.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:32.614906   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:32.614913   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:32.617084   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:33.114744   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:33.114767   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:33.114774   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:33.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:33.116968   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:33.117056   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:33.614614   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:33.614634   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:33.614642   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:33.614648   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:33.616694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:34.114989   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:34.115010   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:34.115019   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:34.115023   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:34.117256   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:34.615017   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:34.615039   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:34.615046   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:34.615049   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:34.617305   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.114707   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:35.114729   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:35.114737   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:35.114741   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:35.116837   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.614518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:35.614541   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:35.614549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:35.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:35.617169   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.617264   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:36.114880   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:36.114903   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:36.114912   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:36.114915   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:36.117413   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:36.615154   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:36.615178   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:36.615186   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:36.615189   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:36.617681   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:37.114404   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:37.114427   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:37.114435   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:37.114438   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:37.116709   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:37.614419   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:37.614444   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:37.614452   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:37.614465   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:37.616814   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:38.114522   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:38.114550   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:38.114560   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:38.114566   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:38.117012   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:38.117111   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:38.614715   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:38.614738   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:38.614746   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:38.614750   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:38.617029   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:39.115300   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:39.115321   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:39.115330   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:39.115334   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:39.117647   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:39.615387   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:39.615412   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:39.615418   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:39.615422   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:39.617840   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.114520   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:40.114541   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:40.114548   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:40.114553   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:40.116874   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.614642   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:40.614667   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:40.614677   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:40.614682   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:40.617201   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.617299   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:41.114884   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:41.114913   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:41.114925   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:41.114930   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:41.117705   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:41.614760   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:41.614784   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:41.614793   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:41.614799   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:41.617304   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.115055   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:42.115077   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:42.115086   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:42.115092   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:42.117464   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.615207   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:42.615231   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:42.615242   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:42.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:42.617788   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.617906   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:43.114443   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:43.114471   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:43.114484   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:43.114489   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:43.116804   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:43.614503   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:43.614534   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:43.614546   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:43.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:43.616923   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:44.114333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:44.114362   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:44.114371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:44.114376   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:44.116593   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:44.615353   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:44.615375   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:44.615383   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:44.615387   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:44.619020   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:44.619252   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:45.114535   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:45.114558   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:45.114565   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:45.114568   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:45.116805   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:45.614455   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:45.614477   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:45.614485   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:45.614489   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:45.616531   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:46.115306   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:46.115327   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:46.115334   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:46.115340   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:46.117430   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:46.615326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:46.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:46.615358   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:46.615364   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:46.617638   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:47.115375   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:47.115397   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:47.115405   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:47.115410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:47.117966   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:47.118069   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:47.614605   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:47.614627   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:47.614635   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:47.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:47.617373   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:48.115142   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:48.115164   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:48.115173   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:48.115177   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:48.117353   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:48.615075   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:48.615097   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:48.615105   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:48.615109   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:48.617317   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.114470   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:49.114492   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:49.114501   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:49.114506   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:49.116813   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.615412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:49.615433   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:49.615441   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:49.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:49.617717   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.617816   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:50.115355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:50.115376   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:50.115384   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:50.115389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:50.117802   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:50.614440   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:50.614462   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:50.614469   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:50.614474   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:50.616542   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:51.115295   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:51.115318   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:51.115325   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:51.115329   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:51.118739   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:51.614657   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:51.614694   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:51.614703   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:51.614708   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:51.616892   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:52.114541   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:52.114568   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:52.114575   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:52.114578   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:52.117054   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:52.117156   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:52.614718   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:52.614748   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:52.614759   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:52.614765   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:52.617263   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:53.114959   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:53.114984   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:53.114996   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:53.115000   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:53.117274   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:53.615035   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:53.615060   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:53.615070   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:53.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:53.617250   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.114646   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:54.114679   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:54.114686   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:54.114694   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:54.116952   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.614585   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:54.614604   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:54.614612   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:54.614615   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:54.616959   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.617087   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:55.114518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:55.114543   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:55.114550   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:55.114556   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:55.117176   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:55.614804   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:55.614830   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:55.614843   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:55.614848   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:55.617029   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.114710   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:56.114739   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:56.114750   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:56.114757   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:56.117352   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.615042   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:56.615064   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:56.615072   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:56.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:56.617503   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.617629   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:57.115247   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:57.115273   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:57.115283   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:57.115289   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:57.119157   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:57.614778   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:57.614799   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:57.614808   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:57.614812   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:57.617771   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:58.114423   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:58.114444   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:58.114451   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:58.114455   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:58.116940   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:58.614594   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:58.614616   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:58.614626   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:58.614631   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:58.616901   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:59.114914   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:59.114934   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:59.114942   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:59.114945   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:59.117144   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:59.117235   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:59.614791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:59.614814   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:59.614822   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:59.614827   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:59.617115   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:00.115321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:00.115354   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:00.115362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:00.115366   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:00.117649   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:00.615378   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:00.615400   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:00.615411   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:00.615416   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:00.617719   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:01.114375   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:01.114397   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:01.114404   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:01.114408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:01.116718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:01.614966   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:01.614991   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:01.615002   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:01.615011   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:01.618973   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:01.619078   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:02.114685   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:02.114710   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:02.114718   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:02.114723   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:02.117526   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:02.615258   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:02.615281   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:02.615289   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:02.615293   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:02.617822   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:03.115326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:03.115355   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:03.115366   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:03.115371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:03.117667   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:03.615340   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:03.615365   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:03.615374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:03.615379   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:03.617818   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:04.115204   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:04.115226   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:04.115234   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:04.115238   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:04.117764   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:04.117866   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:04.615339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:04.615357   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:04.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:04.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:04.617952   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:05.114451   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:05.114472   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:05.114480   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:05.114484   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:05.116809   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:05.614454   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:05.614475   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:05.614482   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:05.614487   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:05.616856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.114518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:06.114541   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:06.114549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:06.114553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:06.117433   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.615116   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:06.615137   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:06.615145   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:06.615149   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:06.617328   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.617423   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:07.115073   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:07.115096   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:07.115105   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:07.115109   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:07.117243   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:07.614957   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:07.614980   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:07.614988   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:07.614992   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:07.617455   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:08.115203   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:08.115228   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:08.115237   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:08.115242   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:08.117953   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:08.614601   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:08.614621   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:08.614627   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:08.614632   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:08.616977   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:09.115171   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:09.115192   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:09.115200   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:09.115204   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:09.117505   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:09.117620   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:09.615237   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:09.615259   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:09.615266   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:09.615270   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:09.617567   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:10.115157   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:10.115180   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:10.115188   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:10.115191   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:10.117490   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:10.615247   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:10.615268   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:10.615277   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:10.615280   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:10.618489   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:11.115353   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:11.115374   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:11.115382   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:11.115385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:11.118557   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:11.118654   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:11.614419   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:11.614439   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:11.614447   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:11.614451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:11.616736   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:12.114441   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:12.114467   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:12.114475   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:12.114479   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:12.117113   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:12.615359   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:12.615379   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:12.615387   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:12.615390   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:12.617471   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.115196   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:13.115221   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:13.115230   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:13.115235   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:13.117548   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.615239   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:13.615269   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:13.615279   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:13.615285   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:13.617765   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.617868   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:14.115201   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:14.115222   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:14.115230   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:14.115238   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:14.118205   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:14.614910   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:14.614930   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:14.614941   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:14.614946   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:14.617345   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.114915   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:15.114940   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:15.114953   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:15.114959   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:15.117285   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.615063   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:15.615091   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:15.615102   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:15.615109   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:15.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.617892   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:16.114326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:16.114345   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:16.114353   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:16.114358   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:16.116687   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:16.614425   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:16.614445   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:16.614456   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:16.614463   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:16.616497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:17.115235   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:17.115266   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:17.115275   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:17.115281   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:17.117592   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:17.615370   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:17.615394   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:17.615403   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:17.615408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:17.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:18.115421   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:18.115449   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:18.115460   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:18.115466   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:18.117540   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:18.117666   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:18.615244   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:18.615268   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:18.615280   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:18.615285   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:18.617069   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:53:19.115249   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:19.115272   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:19.115282   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:19.115288   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:19.117713   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:19.614391   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:19.614427   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:19.614435   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:19.614439   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:19.616687   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:20.115222   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:20.115243   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:20.115251   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:20.115255   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:20.119958   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:53:20.120050   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:20.614619   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:20.614641   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:20.614651   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:20.614658   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:20.617751   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:21.115329   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:21.115350   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:21.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:21.115362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:21.118322   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:21.615343   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:21.615364   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:21.615373   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:21.615376   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:21.617662   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.114307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:22.114356   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:22.114367   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:22.114373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:22.116718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.614407   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:22.614436   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:22.614447   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:22.614452   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:22.616582   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.616699   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:23.115301   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:23.115323   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:23.115331   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:23.115335   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:23.117744   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:23.614413   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:23.614437   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:23.614447   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:23.614453   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:23.616559   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.115103   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:24.115133   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:24.115143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:24.115147   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:24.117693   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.614545   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:24.614569   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:24.614577   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:24.614581   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:24.617065   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.617179   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:25.114461   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:25.114485   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:25.114493   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:25.114496   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:25.116786   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:25.614416   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:25.614438   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:25.614446   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:25.614451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:25.616751   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.114388   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:26.114410   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:26.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:26.114421   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:26.116745   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.614603   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:26.614626   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:26.614634   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:26.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:26.617102   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.617207   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:27.114783   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:27.114807   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:27.114818   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:27.114826   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:27.117702   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:27.614374   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:27.614412   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:27.614420   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:27.614425   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:27.616497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.115222   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:28.115243   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:28.115250   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:28.115254   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:28.117523   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.615319   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:28.615342   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:28.615350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:28.615354   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:28.617775   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.617869   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:29.114872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:29.114893   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:29.114901   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:29.114907   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:29.116856   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:53:29.615278   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:29.615300   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:29.615308   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:29.615313   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:29.617690   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:30.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:30.115351   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:30.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:30.115363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:30.117881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:30.614571   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:30.614593   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:30.614601   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:30.614605   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:30.617497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:31.115219   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:31.115240   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:31.115247   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:31.115252   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:31.117580   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:31.117691   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:31.614491   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:31.614514   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:31.614520   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:31.614525   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:31.616752   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:32.114434   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:32.114457   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:32.114465   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:32.114469   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:32.116843   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:32.614510   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:32.614531   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:32.614537   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:32.614540   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:32.617151   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:33.114829   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:33.114852   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:33.114859   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:33.114863   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:33.117627   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:33.117746   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:33.615336   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:33.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:33.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:33.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:33.617473   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:34.114732   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:34.114770   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:34.114783   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:34.114788   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:34.117561   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:34.615316   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:34.615341   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:34.615351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:34.615356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:34.618153   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.114569   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:35.114593   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:35.114601   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:35.114605   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:35.116953   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.614348   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:35.614373   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:35.614383   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:35.614389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:35.617139   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.617237   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:36.114791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:36.114812   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:36.114819   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:36.114823   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:36.117593   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:36.615382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:36.615407   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:36.615417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:36.615426   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:36.618040   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:37.114722   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:37.114753   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:37.114761   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:37.114765   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:37.116947   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:37.614643   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:37.614686   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:37.614697   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:37.614702   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:37.616876   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:38.114536   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:38.114559   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:38.114566   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:38.114570   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:38.117369   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:38.117462   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:38.615126   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:38.615148   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:38.615156   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:38.615160   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:38.617869   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:39.115081   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:39.115113   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:39.115122   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:39.115126   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:39.117948   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:39.614619   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:39.614647   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:39.614659   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:39.614665   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:39.617484   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.115106   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:40.115131   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:40.115143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:40.115149   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:40.117287   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.615033   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:40.615059   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:40.615071   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:40.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:40.617572   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.617676   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:41.115286   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:41.115309   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:41.115316   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:41.115321   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:41.117762   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:41.614713   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:41.614734   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:41.614743   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:41.614747   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:41.617493   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:42.115269   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:42.115292   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:42.115303   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:42.115308   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:42.117720   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:42.614392   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:42.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:42.614427   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:42.614434   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:42.616931   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:43.115385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:43.115412   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:43.115425   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:43.115433   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:43.118066   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:43.118207   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:43.614713   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:43.614753   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:43.614765   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:43.614770   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:43.617067   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:44.114374   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:44.114406   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:44.114415   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:44.114419   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:44.116619   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:44.615405   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:44.615433   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:44.615441   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:44.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:44.617626   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.115126   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:45.115150   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:45.115158   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:45.115163   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:45.117350   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.615112   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:45.615135   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:45.615142   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:45.615147   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:45.617618   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.617714   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:46.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:46.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:46.115371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:46.115374   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:46.117523   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:46.615363   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:46.615386   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:46.615394   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:46.615398   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:46.617675   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.114336   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:47.114357   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:47.114365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:47.114369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:47.116450   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.615209   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:47.615232   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:47.615242   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:47.615248   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:47.617669   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.617889   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:48.114456   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:48.114479   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:48.114488   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:48.114491   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:48.116715   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:48.614390   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:48.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:48.614424   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:48.614427   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:48.616735   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:49.114828   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:49.114850   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:49.114858   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:49.114863   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:49.117111   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:49.614976   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:49.614997   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:49.615005   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:49.615010   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:49.617505   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:50.115004   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:50.115026   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:50.115033   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:50.115038   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:50.117347   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:50.117441   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:50.615143   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:50.615170   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:50.615179   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:50.615187   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:50.617427   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:51.115170   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:51.115193   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:51.115205   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:51.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:51.117289   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:51.615380   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:51.615407   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:51.615419   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:51.615426   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:51.618038   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.114724   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:52.114760   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:52.114773   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:52.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:52.117189   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.614887   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:52.614911   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:52.614922   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:52.614927   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:52.617222   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.617335   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:53.114967   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:53.114994   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:53.115005   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:53.115013   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:53.117578   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:53.614368   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:53.614394   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:53.614404   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:53.614412   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:53.617467   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:54.114883   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:54.114906   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:54.114915   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:54.114921   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:54.117603   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:54.615330   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:54.615353   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:54.615364   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:54.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:54.618101   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:54.618221   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:55.114614   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:55.114640   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:55.114649   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:55.114656   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:55.117436   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:55.615236   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:55.615260   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:55.615270   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:55.615276   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:55.617974   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:56.114490   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:56.114511   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:56.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:56.114524   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:56.117090   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:56.614907   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:56.614932   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:56.614943   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:56.614948   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:56.618676   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:56.618791   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:56.618808   39794 node_ready.go:38] duration metric: took 4m0.004607374s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:53:56.620932   39794 out.go:177] 
	W0717 17:53:56.622268   39794 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0717 17:53:56.622282   39794 out.go:239] * 
	W0717 17:53:56.623241   39794 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:53:56.625101   39794 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	4c2118d2ed18a       6e38f40d628db       3 minutes ago       Running             storage-provisioner       2                   700d9f5e713d3       storage-provisioner
	dd5e8f56c4264       5cc3abe5717db       4 minutes ago       Running             kindnet-cni               1                   dbdf19f96898d       kindnet-5zksq
	b50ede0dde503       cbb01a7bd410d       4 minutes ago       Running             coredns                   1                   4c25cc8ac2148       coredns-7db6d8ff4d-n4xtd
	b27c10fa3251b       8c811b4aec35f       4 minutes ago       Running             busybox                   1                   c15a92e53e40d       busybox-fc5497c4f-5ngfp
	85983f98f84b9       cbb01a7bd410d       4 minutes ago       Running             coredns                   1                   507cc72648f25       coredns-7db6d8ff4d-sh96r
	603ad8840c526       6e38f40d628db       4 minutes ago       Exited              storage-provisioner       1                   700d9f5e713d3       storage-provisioner
	cede48d48fe27       53c535741fb44       4 minutes ago       Running             kube-proxy                1                   1b59105c6df2e       kube-proxy-jlzt5
	7f7ede089f3e7       7820c83aa1394       4 minutes ago       Running             kube-scheduler            1                   903065308cbb5       kube-scheduler-ha-333994
	38a3e6e69ce36       e874818b3caac       4 minutes ago       Running             kube-controller-manager   1                   bfcca696b5273       kube-controller-manager-ha-333994
	3c3e7888bdfe6       56ce0fd9fb532       4 minutes ago       Running             kube-apiserver            1                   2a8a2b0c39cd0       kube-apiserver-ha-333994
	41d1b53347d3e       3861cfcd7c04c       4 minutes ago       Running             etcd                      1                   7982d05a46241       etcd-ha-333994
	529be299dc3b8       38af8ddebf499       4 minutes ago       Running             kube-vip                  0                   fb62346baad47       kube-vip-ha-333994
	db107babf5b82       8c811b4aec35f       26 minutes ago      Exited              busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	dcb6f2bdfe23d       cbb01a7bd410d       27 minutes ago      Exited              coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       27 minutes ago      Exited              coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       27 minutes ago      Exited              kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       27 minutes ago      Exited              kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	d3a0374a88e2c       56ce0fd9fb532       27 minutes ago      Exited              kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       27 minutes ago      Exited              kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       27 minutes ago      Exited              etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       27 minutes ago      Exited              kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.464434972Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.673549472Z" level=info msg="RemoveContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.682188663Z" level=info msg="RemoveContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.314045705Z" level=info msg="RemoveContainer for \"2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.319121815Z" level=info msg="RemoveContainer for \"2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320511033Z" level=info msg="StopPodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320605313Z" level=info msg="TearDown network for sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320616460Z" level=info msg="StopPodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320971991Z" level=info msg="RemovePodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.321016823Z" level=info msg="Forcibly stopping sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.321072160Z" level=info msg="TearDown network for sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.325612741Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.325748048Z" level=info msg="RemovePodSandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326267222Z" level=info msg="StopPodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326463624Z" level=info msg="TearDown network for sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326510323Z" level=info msg="StopPodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326827690Z" level=info msg="RemovePodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326922590Z" level=info msg="Forcibly stopping sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326997124Z" level=info msg="TearDown network for sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.331124459Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.331204383Z" level=info msg="RemovePodSandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" returns successfully"
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.387511700Z" level=info msg="CreateContainer within sandbox \"700d9f5e713d3946ac2752599935acff0c22e7d5b1d38328f08b4514902b10af\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:2,}"
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.414846958Z" level=info msg="CreateContainer within sandbox \"700d9f5e713d3946ac2752599935acff0c22e7d5b1d38328f08b4514902b10af\" for &ContainerMetadata{Name:storage-provisioner,Attempt:2,} returns container id \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\""
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.415806226Z" level=info msg="StartContainer for \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\""
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.483461513Z" level=info msg="StartContainer for \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [85983f98f84b97a11a481548c17b6e998bfec291ea5b38640a0522d82a174e86] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:32930 - 39231 "HINFO IN 1138402013862295929.6773124709558145559. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.011527303s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[649992777]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.508) (total time: 30004ms):
	Trace[649992777]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30004ms (17:50:20.513)
	Trace[649992777]: [30.004346914s] [30.004346914s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[119638294]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.509) (total time: 30004ms):
	Trace[119638294]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30003ms (17:50:20.512)
	Trace[119638294]: [30.004435266s] [30.004435266s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1087831118]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.513) (total time: 30001ms):
	Trace[1087831118]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:50:20.514)
	Trace[1087831118]: [30.001558122s] [30.001558122s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [b50ede0dde50338ef9fddc834d572f0d265fdc75b3a6e0ffab0b3a090f0cfac9] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:35715 - 11457 "HINFO IN 3013652693694148412.8082718229865211359. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.009035708s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1696274823]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.643) (total time: 30002ms):
	Trace[1696274823]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:50:20.645)
	Trace[1696274823]: [30.002410627s] [30.002410627s] END
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[990945787]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.645) (total time: 30001ms):
	Trace[990945787]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:50:20.645)
	Trace[990945787]: [30.00126887s] [30.00126887s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1760112988]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.646) (total time: 30000ms):
	Trace[1760112988]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:50:20.646)
	Trace[1760112988]: [30.000893639s] [30.000893639s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:54:02 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    4c5a3bea-29ed-4c23-a2f3-16d92a2e967b
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     27m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     27m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         27m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      27m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m19s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         27m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 27m                    kube-proxy       
	  Normal  Starting                 4m18s                  kube-proxy       
	  Normal  Starting                 27m                    kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  27m (x4 over 27m)      kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    27m (x4 over 27m)      kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     27m (x3 over 27m)      kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  27m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientPID     27m                    kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeHasSufficientMemory  27m                    kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    27m                    kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeAllocatableEnforced  27m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 27m                    kubelet          Starting kubelet.
	  Normal  RegisteredNode           27m                    node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                27m                    kubelet          Node ha-333994 status is now: NodeReady
	  Normal  Starting                 4m35s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  4m35s (x8 over 4m35s)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m35s (x8 over 4m35s)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m35s (x7 over 4m35s)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  4m35s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           4m13s                  node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	
	
	==> dmesg <==
	[Jul17 17:49] You have booted with nomodeset. This means your GPU drivers are DISABLED
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050055] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040308] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.524310] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.354966] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +1.596488] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000007] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +7.929260] systemd-fstab-generator[758]: Ignoring "noauto" option for root device
	[  +0.058074] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.064860] systemd-fstab-generator[770]: Ignoring "noauto" option for root device
	[  +0.158074] systemd-fstab-generator[784]: Ignoring "noauto" option for root device
	[  +0.141409] systemd-fstab-generator[796]: Ignoring "noauto" option for root device
	[  +0.316481] systemd-fstab-generator[830]: Ignoring "noauto" option for root device
	[  +1.413303] systemd-fstab-generator[905]: Ignoring "noauto" option for root device
	[  +6.936615] kauditd_printk_skb: 197 callbacks suppressed
	[  +9.904333] kauditd_printk_skb: 40 callbacks suppressed
	[  +6.090710] kauditd_printk_skb: 81 callbacks suppressed
	
	
	==> etcd [41d1b53347d3ec95c0752a7b8006e52252561ffd6b0613e71f4c4d1a66d84cd1] <==
	{"level":"info","ts":"2024-07-17T17:49:40.746451Z","caller":"fileutil/purge.go:50","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/snap","suffix":"snap","max":5,"interval":"30s"}
	{"level":"info","ts":"2024-07-17T17:49:40.746545Z","caller":"fileutil/purge.go:50","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/wal","suffix":"wal","max":5,"interval":"30s"}
	{"level":"info","ts":"2024-07-17T17:49:40.747109Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 switched to configuration voters=(808613133158692504)"}
	{"level":"info","ts":"2024-07-17T17:49:40.74735Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","added-peer-id":"b38c55c42a3b698","added-peer-peer-urls":["https://192.168.39.180:2380"]}
	{"level":"info","ts":"2024-07-17T17:49:40.747698Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:49:40.747826Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:49:40.768847Z","caller":"embed/etcd.go:726","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2024-07-17T17:49:40.769611Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"b38c55c42a3b698","initial-advertise-peer-urls":["https://192.168.39.180:2380"],"listen-peer-urls":["https://192.168.39.180:2380"],"advertise-client-urls":["https://192.168.39.180:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.39.180:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2024-07-17T17:49:40.771975Z","caller":"embed/etcd.go:857","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2024-07-17T17:49:40.783644Z","caller":"embed/etcd.go:597","msg":"serving peer traffic","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:49:40.784432Z","caller":"embed/etcd.go:569","msg":"cmux::serve","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:49:42.218092Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.218153Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.21819Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.218203Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218304Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218487Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218517Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.221374Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:49:42.221719Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:49:42.224325Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:49:42.224772Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:49:42.240735Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:49:42.240792Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:49:42.251537Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:41:11.077099Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":1506}
	{"level":"info","ts":"2024-07-17T17:41:11.08271Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":1506,"took":"4.803656ms","hash":4135639207,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2002944,"current-db-size-in-use":"2.0 MB"}
	{"level":"info","ts":"2024-07-17T17:41:11.082934Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":4135639207,"revision":1506,"compact-revision":967}
	{"level":"info","ts":"2024-07-17T17:46:11.088545Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":2115}
	{"level":"info","ts":"2024-07-17T17:46:11.093763Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":2115,"took":"4.690419ms","hash":3040853481,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2105344,"current-db-size-in-use":"2.1 MB"}
	{"level":"info","ts":"2024-07-17T17:46:11.093935Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":3040853481,"revision":2115,"compact-revision":1506}
	
	
	==> kernel <==
	 17:54:08 up 4 min,  0 users,  load average: 0.05, 0.12, 0.06
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [dd5e8f56c4264ac3ce97606579dbb45bd1defa712cc5dfd7ef8601f279e53896] <==
	I0717 17:53:01.817667       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:11.814577       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:11.814652       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:11.814805       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:11.814813       1 main.go:303] handling current node
	I0717 17:53:21.810119       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:21.810198       1 main.go:303] handling current node
	I0717 17:53:21.810249       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:21.810274       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:31.817062       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:31.817224       1 main.go:303] handling current node
	I0717 17:53:31.817323       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:31.817345       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:41.816987       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:41.817045       1 main.go:303] handling current node
	I0717 17:53:41.817065       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:41.817072       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:51.809287       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:51.809360       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:51.810037       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:51.810079       1 main.go:303] handling current node
	I0717 17:54:01.809782       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:54:01.809814       1 main.go:303] handling current node
	I0717 17:54:01.809828       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:54:01.809833       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:46:36.593294       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:46:46.594446       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:46:46.594495       1 main.go:303] handling current node
	I0717 17:46:46.594508       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:46:46.594516       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:46:56.593210       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:46:56.593351       1 main.go:303] handling current node
	I0717 17:46:56.593473       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:46:56.593496       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:06.593427       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:06.593567       1 main.go:303] handling current node
	I0717 17:47:06.593587       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:06.593593       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:16.603181       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:16.603262       1 main.go:303] handling current node
	I0717 17:47:16.603286       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:16.603292       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:26.593294       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:26.593479       1 main.go:303] handling current node
	I0717 17:47:26.593751       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:26.593932       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:36.593175       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:36.593213       1 main.go:303] handling current node
	I0717 17:47:36.593235       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:36.593240       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [3c3e7888bdfe65eb452a8b1911680c8ed68a5d49a41528c6544c9bdbad54463d] <==
	I0717 17:49:43.595082       1 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
	I0717 17:49:43.595111       1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
	I0717 17:49:43.595140       1 crd_finalizer.go:266] Starting CRDFinalizer
	I0717 17:49:43.597000       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0717 17:49:43.597114       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0717 17:49:43.641418       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0717 17:49:43.648238       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0717 17:49:43.648665       1 policy_source.go:224] refreshing policies
	I0717 17:49:43.659841       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:49:43.676754       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0717 17:49:43.677085       1 shared_informer.go:320] Caches are synced for configmaps
	I0717 17:49:43.679683       1 apf_controller.go:379] Running API Priority and Fairness config worker
	I0717 17:49:43.679810       1 apf_controller.go:382] Running API Priority and Fairness periodic rebalancing process
	I0717 17:49:43.682669       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0717 17:49:43.686464       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0717 17:49:43.688086       1 handler_discovery.go:447] Starting ResourceDiscoveryManager
	E0717 17:49:43.689041       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0717 17:49:43.691390       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0717 17:49:43.692086       1 aggregator.go:165] initial CRD sync complete...
	I0717 17:49:43.692210       1 autoregister_controller.go:141] Starting autoregister controller
	I0717 17:49:43.692231       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0717 17:49:43.692323       1 cache.go:39] Caches are synced for autoregister controller
	I0717 17:49:44.589738       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:49:55.907406       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:49:56.140322       1 controller.go:615] quota admission added evaluator for: endpoints
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [38a3e6e69ce36e4718f7597a891505e74d497b2ce82217fdebe3363666ea32f6] <==
	I0717 17:49:55.953819       1 shared_informer.go:320] Caches are synced for stateful set
	I0717 17:49:55.969497       1 shared_informer.go:320] Caches are synced for disruption
	I0717 17:49:55.969720       1 shared_informer.go:320] Caches are synced for daemon sets
	I0717 17:49:55.989955       1 shared_informer.go:320] Caches are synced for crt configmap
	I0717 17:49:55.995325       1 shared_informer.go:320] Caches are synced for taint
	I0717 17:49:55.995861       1 node_lifecycle_controller.go:1227] "Initializing eviction metric for zone" logger="node-lifecycle-controller" zone=""
	I0717 17:49:56.008684       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994"
	I0717 17:49:56.009020       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:49:56.009215       1 node_lifecycle_controller.go:1073] "Controller detected that zone is now in new state" logger="node-lifecycle-controller" zone="" newState="Normal"
	I0717 17:49:56.107028       1 shared_informer.go:320] Caches are synced for endpoint_slice_mirroring
	I0717 17:49:56.125129       1 shared_informer.go:320] Caches are synced for HPA
	I0717 17:49:56.130984       1 shared_informer.go:320] Caches are synced for endpoint
	I0717 17:49:56.150989       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:49:56.160240       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:49:56.545417       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:49:56.545744       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0717 17:49:56.607585       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:50:29.652302       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="17.989423ms"
	I0717 17:50:29.652927       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="154.343µs"
	I0717 17:50:29.673006       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="10.432657ms"
	I0717 17:50:29.674427       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="35.074µs"
	I0717 17:54:00.096330       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="23.157048ms"
	I0717 17:54:00.103117       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.620259ms"
	I0717 17:54:00.103395       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="156.252µs"
	I0717 17:54:00.105615       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="47.541µs"
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	I0717 17:46:44.300951       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.533645ms"
	I0717 17:46:44.302036       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="47.71µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-proxy [cede48d48fe274c1e899c0bd8bea598571a7def0a52e5e2bade595ef4f553fef] <==
	I0717 17:49:50.697431       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:49:50.728033       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:49:50.773252       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:49:50.773306       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:49:50.773323       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:49:50.776016       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:49:50.776460       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:49:50.776490       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:49:50.778529       1 config.go:192] "Starting service config controller"
	I0717 17:49:50.778847       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:49:50.778963       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:49:50.779098       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:49:50.780341       1 config.go:319] "Starting node config controller"
	I0717 17:49:50.780372       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:49:50.880389       1 shared_informer.go:320] Caches are synced for service config
	I0717 17:49:50.880465       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:49:50.880915       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kube-scheduler [7f7ede089f3e73228764b3c542d044e8dfb371908879f2d014d0b3cb56b61a60] <==
	I0717 17:49:41.818392       1 serving.go:380] Generated self-signed cert in-memory
	I0717 17:49:43.698181       1 server.go:154] "Starting Kubernetes Scheduler" version="v1.30.2"
	I0717 17:49:43.698222       1 server.go:156] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:49:43.704731       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0717 17:49:43.704960       1 requestheader_controller.go:169] Starting RequestHeaderAuthRequestController
	I0717 17:49:43.705003       1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController
	I0717 17:49:43.705055       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0717 17:49:43.708667       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0717 17:49:43.708702       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0717 17:49:43.708715       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file"
	I0717 17:49:43.708721       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
	I0717 17:49:43.805438       1 shared_informer.go:320] Caches are synced for RequestHeaderAuthRequestController
	I0717 17:49:43.809697       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
	I0717 17:49:43.809823       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:50:20 ha-333994 kubelet[912]: I0717 17:50:20.667533     912 scope.go:117] "RemoveContainer" containerID="86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21"
	Jul 17 17:50:20 ha-333994 kubelet[912]: I0717 17:50:20.668345     912 scope.go:117] "RemoveContainer" containerID="603ad8840c52684184d18957755dbefa293c0f1b45c847cd88296b580d9ac18f"
	Jul 17 17:50:20 ha-333994 kubelet[912]: E0717 17:50:20.668770     912 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(123c311b-67ed-42b2-ad53-cc59077dfbe7)\"" pod="kube-system/storage-provisioner" podUID="123c311b-67ed-42b2-ad53-cc59077dfbe7"
	Jul 17 17:50:33 ha-333994 kubelet[912]: I0717 17:50:33.312537     912 scope.go:117] "RemoveContainer" containerID="2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d"
	Jul 17 17:50:33 ha-333994 kubelet[912]: E0717 17:50:33.409447     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:50:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:50:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:50:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:50:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:50:36 ha-333994 kubelet[912]: I0717 17:50:36.384656     912 scope.go:117] "RemoveContainer" containerID="603ad8840c52684184d18957755dbefa293c0f1b45c847cd88296b580d9ac18f"
	Jul 17 17:51:33 ha-333994 kubelet[912]: E0717 17:51:33.410923     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:51:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:51:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:51:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:51:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:52:33 ha-333994 kubelet[912]: E0717 17:52:33.411201     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:52:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:52:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:52:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:52:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:53:33 ha-333994 kubelet[912]: E0717 17:53:33.409498     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:53:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:53:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:53:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:53:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6 busybox-fc5497c4f-gtghn
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/DeleteSecondaryNode]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6 busybox-fc5497c4f-gtghn
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6 busybox-fc5497c4f-gtghn:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  4m26s                default-scheduler  0/2 nodes are available: 1 node(s) didn't match pod anti-affinity rules, 1 node(s) had untolerated taint {node.kubernetes.io/unreachable: }. preemption: 0/2 nodes are available: 1 No preemption victims found for incoming pod, 1 Preemption is not helpful for scheduling.
	  Warning  FailedScheduling  4m20s                default-scheduler  0/2 nodes are available: 1 node(s) didn't match pod anti-affinity rules, 1 node(s) had untolerated taint {node.kubernetes.io/unreachable: }. preemption: 0/2 nodes are available: 1 No preemption victims found for incoming pod, 1 Preemption is not helpful for scheduling.
	  Warning  FailedScheduling  16m (x3 over 26m)    default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  8m24s (x3 over 13m)  default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.
	
	
	Name:             busybox-fc5497c4f-gtghn
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-lfmtp (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-lfmtp:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age   From               Message
	  ----     ------            ----  ----               -------
	  Warning  FailedScheduling  9s    default-scheduler  0/2 nodes are available: 1 node(s) didn't match pod anti-affinity rules, 1 node(s) were unschedulable. preemption: 0/2 nodes are available: 1 No preemption victims found for incoming pod, 1 Preemption is not helpful for scheduling.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/DeleteSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/DeleteSecondaryNode (10.53s)

                                                
                                    
x
+
TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (2.91s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:390: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
ha_test.go:413: expected profile "ha-333994" in json of 'profile list' to have "Degraded" status but have "Stopped" status. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-333994\",\"Status\":\"Stopped\",\"Config\":{\"Name\":\"ha-333994\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"kvm2\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACount\":
1,\"APIServerPort\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.2\",\"ClusterName\":\"ha-333994\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.168.39.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"containerd\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.168.39.180\",\"Port\":8443,
\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.168.39.127\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"containerd\",\"ControlPlane\":true,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":false,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-device-plugin\":false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":fals
e,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/home/jenkins:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\"\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOptimizations\":false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\
"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-linux-amd64 profile list --output json"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
helpers_test.go:244: <<< TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-333994 logs -n 25: (1.64996925s)
helpers_test.go:252: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete logs: 
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                 Args                 |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].status.podIP}'  |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.io               |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 --           |           |         |         |                     |                     |
	|         | nslookup kubernetes.default          |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6 -- nslookup  |           |         |         |                     |                     |
	|         | kubernetes.default.svc.cluster.local |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- get pods -o          | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | jsonpath='{.items[*].metadata.name}' |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:38 UTC |
	|         | busybox-fc5497c4f-5ngfp -- sh        |           |         |         |                     |                     |
	|         | -c ping -c 1 192.168.39.1            |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-74lsp              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| kubectl | -p ha-333994 -- exec                 | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC |                     |
	|         | busybox-fc5497c4f-djvz6              |           |         |         |                     |                     |
	|         | -- sh -c nslookup                    |           |         |         |                     |                     |
	|         | host.minikube.internal | awk         |           |         |         |                     |                     |
	|         | 'NR==5' | cut -d' ' -f3              |           |         |         |                     |                     |
	| node    | add -p ha-333994 -v=7                | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:38 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node stop m02 -v=7         | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC | 17 Jul 24 17:40 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | ha-333994 node start m02 -v=7        | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:40 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-333994 -v=7               | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:46 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| stop    | -p ha-333994 -v=7                    | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:46 UTC | 17 Jul 24 17:49 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| start   | -p ha-333994 --wait=true -v=7        | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:49 UTC |                     |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	| node    | list -p ha-333994                    | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:53 UTC |                     |
	| node    | ha-333994 node delete m03 -v=7       | ha-333994 | jenkins | v1.33.1 | 17 Jul 24 17:53 UTC | 17 Jul 24 17:54 UTC |
	|         | --alsologtostderr                    |           |         |         |                     |                     |
	|---------|--------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:49:11
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:49:11.274843   39794 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:49:11.274995   39794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:49:11.275005   39794 out.go:304] Setting ErrFile to fd 2...
	I0717 17:49:11.275011   39794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:49:11.275192   39794 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:49:11.275748   39794 out.go:298] Setting JSON to false
	I0717 17:49:11.276624   39794 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":5494,"bootTime":1721233057,"procs":178,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:49:11.276685   39794 start.go:139] virtualization: kvm guest
	I0717 17:49:11.279428   39794 out.go:177] * [ha-333994] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:49:11.280920   39794 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:49:11.280939   39794 notify.go:220] Checking for updates...
	I0717 17:49:11.284081   39794 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:49:11.285572   39794 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:11.286973   39794 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:49:11.288259   39794 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:49:11.289617   39794 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:49:11.291360   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:11.291471   39794 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:49:11.291860   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:11.291910   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:11.306389   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41441
	I0717 17:49:11.306830   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:11.307340   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:11.307365   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:11.307652   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:11.307877   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:11.342518   39794 out.go:177] * Using the kvm2 driver based on existing profile
	I0717 17:49:11.343905   39794 start.go:297] selected driver: kvm2
	I0717 17:49:11.343922   39794 start.go:901] validating driver "kvm2" against &{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVer
sion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false
ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountU
ID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:11.344074   39794 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:49:11.344385   39794 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:49:11.344460   39794 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:49:11.359473   39794 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:49:11.360126   39794 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0717 17:49:11.360191   39794 cni.go:84] Creating CNI manager for ""
	I0717 17:49:11.360203   39794 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0717 17:49:11.360258   39794 start.go:340] cluster config:
	{Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39
.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false i
stio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:fa
lse CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:11.360356   39794 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:49:11.362215   39794 out.go:177] * Starting "ha-333994" primary control-plane node in "ha-333994" cluster
	I0717 17:49:11.363497   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:11.363528   39794 preload.go:146] Found local preload: /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:49:11.363538   39794 cache.go:56] Caching tarball of preloaded images
	I0717 17:49:11.363621   39794 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:49:11.363633   39794 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:49:11.363751   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:11.363927   39794 start.go:360] acquireMachinesLock for ha-333994: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:49:11.363968   39794 start.go:364] duration metric: took 23.038µs to acquireMachinesLock for "ha-333994"
	I0717 17:49:11.363985   39794 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:49:11.363995   39794 fix.go:54] fixHost starting: 
	I0717 17:49:11.364238   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:11.364269   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:11.378515   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45003
	I0717 17:49:11.378994   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:11.379458   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:11.379478   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:11.379772   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:11.379977   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:11.380153   39794 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:49:11.381889   39794 fix.go:112] recreateIfNeeded on ha-333994: state=Stopped err=<nil>
	I0717 17:49:11.381920   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	W0717 17:49:11.382061   39794 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:49:11.384353   39794 out.go:177] * Restarting existing kvm2 VM for "ha-333994" ...
	I0717 17:49:11.386332   39794 main.go:141] libmachine: (ha-333994) Calling .Start
	I0717 17:49:11.386525   39794 main.go:141] libmachine: (ha-333994) Ensuring networks are active...
	I0717 17:49:11.387295   39794 main.go:141] libmachine: (ha-333994) Ensuring network default is active
	I0717 17:49:11.387605   39794 main.go:141] libmachine: (ha-333994) Ensuring network mk-ha-333994 is active
	I0717 17:49:11.387902   39794 main.go:141] libmachine: (ha-333994) Getting domain xml...
	I0717 17:49:11.388700   39794 main.go:141] libmachine: (ha-333994) Creating domain...
	I0717 17:49:12.581316   39794 main.go:141] libmachine: (ha-333994) Waiting to get IP...
	I0717 17:49:12.582199   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:12.582613   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:12.582685   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:12.582591   39823 retry.go:31] will retry after 292.960023ms: waiting for machine to come up
	I0717 17:49:12.877268   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:12.877833   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:12.877861   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:12.877756   39823 retry.go:31] will retry after 283.500887ms: waiting for machine to come up
	I0717 17:49:13.163417   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:13.163805   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:13.163826   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:13.163761   39823 retry.go:31] will retry after 385.368306ms: waiting for machine to come up
	I0717 17:49:13.550406   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:13.550840   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:13.550897   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:13.550822   39823 retry.go:31] will retry after 528.571293ms: waiting for machine to come up
	I0717 17:49:14.080602   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:14.081093   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:14.081118   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:14.081048   39823 retry.go:31] will retry after 736.772802ms: waiting for machine to come up
	I0717 17:49:14.818924   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:14.819326   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:14.819347   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:14.819281   39823 retry.go:31] will retry after 776.986347ms: waiting for machine to come up
	I0717 17:49:15.598237   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:15.598607   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:15.598627   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:15.598573   39823 retry.go:31] will retry after 1.036578969s: waiting for machine to come up
	I0717 17:49:16.637046   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:16.637440   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:16.637463   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:16.637404   39823 retry.go:31] will retry after 1.055320187s: waiting for machine to come up
	I0717 17:49:17.694838   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:17.695248   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:17.695273   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:17.695211   39823 retry.go:31] will retry after 1.335817707s: waiting for machine to come up
	I0717 17:49:19.032835   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:19.033306   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:19.033330   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:19.033266   39823 retry.go:31] will retry after 1.730964136s: waiting for machine to come up
	I0717 17:49:20.766254   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:20.766740   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:20.766768   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:20.766694   39823 retry.go:31] will retry after 2.796619276s: waiting for machine to come up
	I0717 17:49:23.566195   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:23.566759   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:23.566784   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:23.566716   39823 retry.go:31] will retry after 3.008483388s: waiting for machine to come up
	I0717 17:49:26.576866   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:26.577295   39794 main.go:141] libmachine: (ha-333994) DBG | unable to find current IP address of domain ha-333994 in network mk-ha-333994
	I0717 17:49:26.577318   39794 main.go:141] libmachine: (ha-333994) DBG | I0717 17:49:26.577242   39823 retry.go:31] will retry after 2.889284576s: waiting for machine to come up
	I0717 17:49:29.467942   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.468316   39794 main.go:141] libmachine: (ha-333994) Found IP for machine: 192.168.39.180
	I0717 17:49:29.468337   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has current primary IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.468346   39794 main.go:141] libmachine: (ha-333994) Reserving static IP address...
	I0717 17:49:29.468737   39794 main.go:141] libmachine: (ha-333994) Reserved static IP address: 192.168.39.180
	I0717 17:49:29.468757   39794 main.go:141] libmachine: (ha-333994) Waiting for SSH to be available...
	I0717 17:49:29.468777   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.468804   39794 main.go:141] libmachine: (ha-333994) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994", mac: "52:54:00:73:4b:68", ip: "192.168.39.180"}
	I0717 17:49:29.468820   39794 main.go:141] libmachine: (ha-333994) DBG | Getting to WaitForSSH function...
	I0717 17:49:29.470695   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.471026   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.471058   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.471199   39794 main.go:141] libmachine: (ha-333994) DBG | Using SSH client type: external
	I0717 17:49:29.471226   39794 main.go:141] libmachine: (ha-333994) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa (-rw-------)
	I0717 17:49:29.471255   39794 main.go:141] libmachine: (ha-333994) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.180 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:49:29.471268   39794 main.go:141] libmachine: (ha-333994) DBG | About to run SSH command:
	I0717 17:49:29.471282   39794 main.go:141] libmachine: (ha-333994) DBG | exit 0
	I0717 17:49:29.598374   39794 main.go:141] libmachine: (ha-333994) DBG | SSH cmd err, output: <nil>: 
	I0717 17:49:29.598754   39794 main.go:141] libmachine: (ha-333994) Calling .GetConfigRaw
	I0717 17:49:29.599414   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:29.601913   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.602312   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.602351   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.602634   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:29.602858   39794 machine.go:94] provisionDockerMachine start ...
	I0717 17:49:29.602888   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:29.603106   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.605092   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.605423   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.605446   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.605613   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.605754   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.605900   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.606023   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.606203   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.606385   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.606396   39794 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:49:29.714755   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:49:29.714801   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.715040   39794 buildroot.go:166] provisioning hostname "ha-333994"
	I0717 17:49:29.715065   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.715237   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.717642   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.717930   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.717959   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.718110   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.718285   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.718413   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.718528   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.718679   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.718838   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.718848   39794 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994 && echo "ha-333994" | sudo tee /etc/hostname
	I0717 17:49:29.840069   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994
	
	I0717 17:49:29.840100   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.842822   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.843208   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.843233   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.843392   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:29.843581   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.843706   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:29.843878   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:29.844054   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:29.844256   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:29.844272   39794 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:49:29.959423   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:49:29.959450   39794 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:49:29.959474   39794 buildroot.go:174] setting up certificates
	I0717 17:49:29.959488   39794 provision.go:84] configureAuth start
	I0717 17:49:29.959495   39794 main.go:141] libmachine: (ha-333994) Calling .GetMachineName
	I0717 17:49:29.959790   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:29.962162   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.962537   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.962563   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.962700   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:29.964777   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.965084   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:29.965116   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:29.965226   39794 provision.go:143] copyHostCerts
	I0717 17:49:29.965266   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:29.965305   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:49:29.965317   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:29.965397   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:49:29.965507   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:29.965534   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:49:29.965544   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:29.965581   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:49:29.965639   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:29.965671   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:49:29.965680   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:29.965714   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:49:29.965774   39794 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994 san=[127.0.0.1 192.168.39.180 ha-333994 localhost minikube]
	I0717 17:49:30.057325   39794 provision.go:177] copyRemoteCerts
	I0717 17:49:30.057377   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:49:30.057400   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.059825   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.060114   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.060140   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.060281   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.060451   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.060561   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.060675   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.146227   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:49:30.146289   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:49:30.174390   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:49:30.174450   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:49:30.202477   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:49:30.202541   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0717 17:49:30.229907   39794 provision.go:87] duration metric: took 270.408982ms to configureAuth
	I0717 17:49:30.229929   39794 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:49:30.230164   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:30.230177   39794 machine.go:97] duration metric: took 627.307249ms to provisionDockerMachine
	I0717 17:49:30.230186   39794 start.go:293] postStartSetup for "ha-333994" (driver="kvm2")
	I0717 17:49:30.230200   39794 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:49:30.230227   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.230520   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:49:30.230554   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.233026   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.233363   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.233390   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.233521   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.233700   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.233828   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.233952   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.318669   39794 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:49:30.323112   39794 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:49:30.323131   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:49:30.323180   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:49:30.323246   39794 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:49:30.323258   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:49:30.323348   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:49:30.334564   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:30.360407   39794 start.go:296] duration metric: took 130.206138ms for postStartSetup
	I0717 17:49:30.360441   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.360727   39794 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:49:30.360774   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.362968   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.363308   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.363334   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.363435   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.363609   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.363749   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.363862   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.448825   39794 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:49:30.448901   39794 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:49:30.490930   39794 fix.go:56] duration metric: took 19.126931057s for fixHost
	I0717 17:49:30.490966   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.493716   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.494056   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.494081   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.494261   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.494473   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.494636   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.494816   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.495007   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:30.495221   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.180 22 <nil> <nil>}
	I0717 17:49:30.495236   39794 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:49:30.611220   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238570.579395854
	
	I0717 17:49:30.611243   39794 fix.go:216] guest clock: 1721238570.579395854
	I0717 17:49:30.611255   39794 fix.go:229] Guest: 2024-07-17 17:49:30.579395854 +0000 UTC Remote: 2024-07-17 17:49:30.49095133 +0000 UTC m=+19.250883626 (delta=88.444524ms)
	I0717 17:49:30.611271   39794 fix.go:200] guest clock delta is within tolerance: 88.444524ms
	I0717 17:49:30.611277   39794 start.go:83] releasing machines lock for "ha-333994", held for 19.24729888s
	I0717 17:49:30.611293   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.611569   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:30.613990   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.614318   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.614355   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.614483   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.614909   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.615067   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:30.615169   39794 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:49:30.615215   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.615255   39794 ssh_runner.go:195] Run: cat /version.json
	I0717 17:49:30.615275   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:30.617353   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617676   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.617702   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617734   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.617863   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.618049   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.618146   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:30.618173   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:30.618217   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.618306   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:30.618370   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.618445   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:30.618555   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:30.618672   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:30.694919   39794 ssh_runner.go:195] Run: systemctl --version
	I0717 17:49:30.721823   39794 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0717 17:49:30.727892   39794 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:49:30.727967   39794 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:49:30.745249   39794 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:49:30.745272   39794 start.go:495] detecting cgroup driver to use...
	I0717 17:49:30.745332   39794 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:49:30.784101   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:49:30.798192   39794 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:49:30.798265   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:49:30.811458   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:49:30.824815   39794 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:49:30.938731   39794 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:49:31.081893   39794 docker.go:233] disabling docker service ...
	I0717 17:49:31.081980   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:49:31.097028   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:49:31.110328   39794 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:49:31.242915   39794 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:49:31.365050   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:49:31.379135   39794 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:49:31.400136   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:49:31.412561   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:49:31.425082   39794 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:49:31.425159   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:49:31.437830   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:31.450453   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:49:31.462175   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:31.473289   39794 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:49:31.484541   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:49:31.495502   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:49:31.506265   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
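Editor's note: the run of in-place edits above (sandbox image, `SystemdCgroup`, runtime type, `conf_dir`, unprivileged ports) all use one idiom — an anchored, indentation-preserving `sed -i -r` over `/etc/containerd/config.toml`. A stand-alone sketch of that pattern against a scratch copy of the relevant fragment (the two-line file here is illustrative, not the full containerd config):

```shell
# Reproduce the config.toml rewrite idiom on a throwaway file.
set -e
f=$(mktemp)
cat > "$f" <<'EOF'
  sandbox_image = "registry.k8s.io/pause:3.8"
  SystemdCgroup = true
EOF
# \1 re-emits the captured leading whitespace, so nesting is preserved.
sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' "$f"
sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$f"
cat "$f"
```

Anchoring on `^( *)key = ` rather than the bare key avoids rewriting commented-out or substring-matching lines.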
	I0717 17:49:31.518840   39794 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:49:31.530158   39794 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:49:31.530208   39794 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:49:31.548502   39794 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
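Editor's note: the status-255 `sysctl` failure above is expected, not fatal — `/proc/sys/net/bridge/` only exists once the `br_netfilter` module is loaded, so a failed read is the cue to `modprobe` it. A read-only sketch of that probe (no module loading attempted here; the privileged step is only named in a comment):

```shell
# Probe the bridge-netfilter sysctl; its absence means br_netfilter is unloaded.
if sysctl net.bridge.bridge-nf-call-iptables >/dev/null 2>&1; then
  status=loaded
else
  status=missing   # a privileged caller would now run: modprobe br_netfilter
fi
echo "br_netfilter: $status"
```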
	I0717 17:49:31.563431   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:31.674043   39794 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:49:31.701907   39794 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:49:31.702006   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:31.706668   39794 retry.go:31] will retry after 920.793788ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
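Editor's note: the "Will wait 60s for socket path" step above is a poll on `stat` with retry backoff — the first probe races containerd's restart and loses, the retry wins. A self-contained sketch of the same wait loop, with a throwaway temp path standing in for `/run/containerd/containerd.sock` and a background `touch` standing in for containerd creating it:

```shell
# Poll for a path that appears asynchronously, with a bounded retry budget.
sock="$(mktemp -u)"               # path that does not exist yet
( sleep 1; touch "$sock" ) &      # simulates the daemon creating its socket
ready=no
for _ in 1 2 3 4 5; do
  if stat "$sock" >/dev/null 2>&1; then ready=yes; break; fi
  sleep 1
done
echo "socket ready: $ready"
```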
	I0717 17:49:32.627794   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:32.632953   39794 start.go:563] Will wait 60s for crictl version
	I0717 17:49:32.633009   39794 ssh_runner.go:195] Run: which crictl
	I0717 17:49:32.636846   39794 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:49:32.677947   39794 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:49:32.678013   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:32.709490   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:32.738106   39794 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:49:32.739529   39794 main.go:141] libmachine: (ha-333994) Calling .GetIP
	I0717 17:49:32.742040   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:32.742375   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:32.742405   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:32.742590   39794 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:49:32.746706   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:32.759433   39794 kubeadm.go:883] updating cluster {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Cl
usterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingre
ss:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:do
cker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0717 17:49:32.759609   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:32.759661   39794 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:49:32.792410   39794 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:49:32.792432   39794 containerd.go:534] Images already preloaded, skipping extraction
	I0717 17:49:32.792483   39794 ssh_runner.go:195] Run: sudo crictl images --output json
	I0717 17:49:32.824536   39794 containerd.go:627] all images are preloaded for containerd runtime.
	I0717 17:49:32.824558   39794 cache_images.go:84] Images are preloaded, skipping loading
	I0717 17:49:32.824565   39794 kubeadm.go:934] updating node { 192.168.39.180 8443 v1.30.2 containerd true true} ...
	I0717 17:49:32.824675   39794 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.180
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:49:32.824722   39794 ssh_runner.go:195] Run: sudo crictl info
	I0717 17:49:32.856864   39794 cni.go:84] Creating CNI manager for ""
	I0717 17:49:32.856886   39794 cni.go:136] multinode detected (3 nodes found), recommending kindnet
	I0717 17:49:32.856893   39794 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0717 17:49:32.856917   39794 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.180 APIServerPort:8443 KubernetesVersion:v1.30.2 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-333994 NodeName:ha-333994 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.180"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.180 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0717 17:49:32.857032   39794 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.180
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-333994"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.180
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.180"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.30.2
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0717 17:49:32.857054   39794 kube-vip.go:115] generating kube-vip config ...
	I0717 17:49:32.857090   39794 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:49:32.875326   39794 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:49:32.875456   39794 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:49:32.875511   39794 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:49:32.885386   39794 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:49:32.885459   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0717 17:49:32.895011   39794 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0717 17:49:32.913107   39794 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:49:32.929923   39794 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0717 17:49:32.946336   39794 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:49:32.962757   39794 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:49:32.966796   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
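Editor's note: both `/etc/hosts` updates above (for `host.minikube.internal` and `control-plane.minikube.internal`) follow a filter-append-copy pattern: drop any stale record for the name, append the fresh one, then copy the result back over the original. Scratch-file reproduction of the same pattern (grep pattern simplified from the tab-anchored original; addresses mirror the log):

```shell
# Idempotently refresh one hosts record on a throwaway copy of /etc/hosts.
set -e
hosts="$(mktemp)"
printf '127.0.0.1\tlocalhost\n192.168.39.254\tcontrol-plane.minikube.internal\n' > "$hosts"
{ grep -v 'control-plane.minikube.internal' "$hosts"
  printf '192.168.39.254\tcontrol-plane.minikube.internal\n'; } > "$hosts.new"
cp "$hosts.new" "$hosts"
count=$(grep -c 'control-plane.minikube.internal' "$hosts")
echo "entries after refresh: $count"
```

Because the old line is filtered out before the new one is appended, re-running the update never accumulates duplicate records.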
	I0717 17:49:32.979550   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:33.092357   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:33.111897   39794 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.180
	I0717 17:49:33.111921   39794 certs.go:194] generating shared ca certs ...
	I0717 17:49:33.111940   39794 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.112113   39794 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:49:33.112206   39794 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:49:33.112225   39794 certs.go:256] generating profile certs ...
	I0717 17:49:33.112347   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:49:33.112383   39794 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1
	I0717 17:49:33.112401   39794 crypto.go:68] Generating cert /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.180 192.168.39.127 192.168.39.254]
	I0717 17:49:33.337392   39794 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 ...
	I0717 17:49:33.337432   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1: {Name:mkfeb2a5adc7d732ca48854394be4077f3b9b81e Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.337612   39794 crypto.go:164] Writing key to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1 ...
	I0717 17:49:33.337630   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1: {Name:mk17811291d2c587100f8fbd5f0c9c2d641ddf76 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.337728   39794 certs.go:381] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt.ac7db6e1 -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt
	I0717 17:49:33.337924   39794 certs.go:385] copying /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.ac7db6e1 -> /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key
	I0717 17:49:33.338098   39794 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:49:33.338134   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:49:33.338154   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:49:33.338172   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:49:33.338188   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:49:33.338203   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:49:33.338221   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:49:33.338239   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:49:33.338253   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:49:33.338313   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:49:33.338354   39794 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:49:33.338363   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:49:33.338391   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:49:33.338431   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:49:33.338457   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:49:33.338511   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:33.338549   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.338570   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.338587   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.339107   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:49:33.371116   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:49:33.405873   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:49:33.442007   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:49:33.472442   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0717 17:49:33.496116   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:49:33.527403   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:49:33.552684   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:49:33.576430   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:49:33.599936   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:49:33.623341   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:49:33.646635   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0717 17:49:33.663325   39794 ssh_runner.go:195] Run: openssl version
	I0717 17:49:33.668872   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:49:33.679471   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.683810   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.683866   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:33.689677   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:49:33.700471   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:49:33.710911   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.715522   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.715581   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:49:33.721331   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:49:33.731730   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:49:33.742074   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.746374   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.746417   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:49:33.751941   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
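Editor's note: the `openssl x509 -hash` / `ln -fs .../<hash>.0` pairs above wire certificates into the OpenSSL trust store, which looks up issuers in a `-CApath` directory by subject-hash filename. A throwaway demo of that mechanism in a temp dir (the self-signed CA and its subject are illustrative):

```shell
# Show that -CApath verification depends on the <subject-hash>.0 symlink.
set -e
dir="$(mktemp -d)"
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demoCA" \
  -keyout "$dir/ca.key" -out "$dir/ca.pem" -days 2 2>/dev/null
hash=$(openssl x509 -hash -noout -in "$dir/ca.pem")
ln -fs "$dir/ca.pem" "$dir/$hash.0"
# Lookup succeeds only because the hash-named link exists in the CApath dir.
openssl verify -CApath "$dir" "$dir/ca.pem"
```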
	I0717 17:49:33.762070   39794 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:49:33.766344   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0717 17:49:33.771976   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0717 17:49:33.777506   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0717 17:49:33.783203   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0717 17:49:33.788713   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0717 17:49:33.794346   39794 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
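Editor's note: the six probes above use `openssl x509 -checkend 86400`, which exits 0 if the certificate is still valid 86400 seconds (24 h) from now and non-zero if it would expire within that window — a cheap way to decide whether cert regeneration is needed. Throwaway demo with a 2-day cert:

```shell
# -checkend N: exit 0 iff the cert survives the next N seconds.
set -e
dir="$(mktemp -d)"
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout "$dir/k.pem" -out "$dir/c.pem" -days 2 2>/dev/null
# Valid ~48h, so a 24h horizon passes (exit 0)...
openssl x509 -noout -in "$dir/c.pem" -checkend 86400 && echo "ok for 24h"
# ...while a 7-day horizon trips the check (non-zero exit).
openssl x509 -noout -in "$dir/c.pem" -checkend 604800 || echo "expires within 7d"
```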
	I0717 17:49:33.800031   39794 kubeadm.go:392] StartCluster: {Name:ha-333994 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 Clust
erName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.197 Port:0 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:
false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docke
r BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:49:33.800131   39794 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0717 17:49:33.800172   39794 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0717 17:49:33.836926   39794 cri.go:89] found id: "86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21"
	I0717 17:49:33.836947   39794 cri.go:89] found id: "dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f"
	I0717 17:49:33.836952   39794 cri.go:89] found id: "5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a"
	I0717 17:49:33.836956   39794 cri.go:89] found id: "f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428"
	I0717 17:49:33.836959   39794 cri.go:89] found id: "0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45"
	I0717 17:49:33.836963   39794 cri.go:89] found id: "2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d"
	I0717 17:49:33.836967   39794 cri.go:89] found id: "d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411"
	I0717 17:49:33.836970   39794 cri.go:89] found id: "2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c"
	I0717 17:49:33.836974   39794 cri.go:89] found id: "5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46"
	I0717 17:49:33.836981   39794 cri.go:89] found id: "515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697"
	I0717 17:49:33.836985   39794 cri.go:89] found id: ""
	I0717 17:49:33.837036   39794 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0717 17:49:33.850888   39794 cri.go:116] JSON = null
	W0717 17:49:33.850933   39794 kubeadm.go:399] unpause failed: list paused: list returned 0 containers, but ps returned 10
	I0717 17:49:33.851001   39794 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0717 17:49:33.861146   39794 kubeadm.go:408] found existing configuration files, will attempt cluster restart
	I0717 17:49:33.861164   39794 kubeadm.go:593] restartPrimaryControlPlane start ...
	I0717 17:49:33.861204   39794 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0717 17:49:33.870180   39794 kubeadm.go:130] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0717 17:49:33.870557   39794 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-333994" does not appear in /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.870654   39794 kubeconfig.go:62] /home/jenkins/minikube-integration/19283-14409/kubeconfig needs updating (will repair): [kubeconfig missing "ha-333994" cluster setting kubeconfig missing "ha-333994" context setting]
	I0717 17:49:33.870894   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.871258   39794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.871471   39794 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.180:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)
}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0717 17:49:33.871875   39794 cert_rotation.go:137] Starting client certificate rotation controller
	I0717 17:49:33.872033   39794 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0717 17:49:33.881089   39794 kubeadm.go:630] The running cluster does not require reconfiguration: 192.168.39.180
	I0717 17:49:33.881107   39794 kubeadm.go:597] duration metric: took 19.938705ms to restartPrimaryControlPlane
	I0717 17:49:33.881113   39794 kubeadm.go:394] duration metric: took 81.089134ms to StartCluster
	I0717 17:49:33.881124   39794 settings.go:142] acquiring lock: {Name:mk91c7387a23a84a0d90c1f4a8be889afd5f8e36 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.881175   39794 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:33.881658   39794 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/kubeconfig: {Name:mkcf3eba146eb28d296552e24aa3055bdbdcc231 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:33.881845   39794 start.go:233] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.180 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:49:33.881872   39794 start.go:241] waiting for startup goroutines ...
	I0717 17:49:33.881879   39794 addons.go:507] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volcano:false volumesnapshots:false yakd:false]
	I0717 17:49:33.882084   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:33.884129   39794 out.go:177] * Enabled addons: 
	I0717 17:49:33.885737   39794 addons.go:510] duration metric: took 3.853682ms for enable addons: enabled=[]
	I0717 17:49:33.885760   39794 start.go:246] waiting for cluster config update ...
	I0717 17:49:33.885767   39794 start.go:255] writing updated cluster config ...
	I0717 17:49:33.887338   39794 out.go:177] 
	I0717 17:49:33.888767   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:33.888845   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:33.890338   39794 out.go:177] * Starting "ha-333994-m02" control-plane node in "ha-333994" cluster
	I0717 17:49:33.891461   39794 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:49:33.891475   39794 cache.go:56] Caching tarball of preloaded images
	I0717 17:49:33.891543   39794 preload.go:172] Found /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0717 17:49:33.891554   39794 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:49:33.891626   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:33.891771   39794 start.go:360] acquireMachinesLock for ha-333994-m02: {Name:mk0f74b853b0d6e269bf0c6a25c6edeb4f1994c0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0717 17:49:33.891806   39794 start.go:364] duration metric: took 19.128µs to acquireMachinesLock for "ha-333994-m02"
	I0717 17:49:33.891819   39794 start.go:96] Skipping create...Using existing machine configuration
	I0717 17:49:33.891826   39794 fix.go:54] fixHost starting: m02
	I0717 17:49:33.892056   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:33.892076   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:33.906264   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44047
	I0717 17:49:33.906599   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:33.907064   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:33.907083   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:33.907400   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:33.907566   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:33.907713   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:49:33.909180   39794 fix.go:112] recreateIfNeeded on ha-333994-m02: state=Stopped err=<nil>
	I0717 17:49:33.909199   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	W0717 17:49:33.909338   39794 fix.go:138] unexpected machine state, will restart: <nil>
	I0717 17:49:33.911077   39794 out.go:177] * Restarting existing kvm2 VM for "ha-333994-m02" ...
	I0717 17:49:33.912122   39794 main.go:141] libmachine: (ha-333994-m02) Calling .Start
	I0717 17:49:33.912246   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring networks are active...
	I0717 17:49:33.912879   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring network default is active
	I0717 17:49:33.913156   39794 main.go:141] libmachine: (ha-333994-m02) Ensuring network mk-ha-333994 is active
	I0717 17:49:33.913539   39794 main.go:141] libmachine: (ha-333994-m02) Getting domain xml...
	I0717 17:49:33.914190   39794 main.go:141] libmachine: (ha-333994-m02) Creating domain...
	I0717 17:49:35.092192   39794 main.go:141] libmachine: (ha-333994-m02) Waiting to get IP...
	I0717 17:49:35.092951   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.093269   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.093360   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.093273   39957 retry.go:31] will retry after 192.383731ms: waiting for machine to come up
	I0717 17:49:35.287679   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.288078   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.288104   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.288046   39957 retry.go:31] will retry after 385.654698ms: waiting for machine to come up
	I0717 17:49:35.675666   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:35.676036   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:35.676064   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:35.675991   39957 retry.go:31] will retry after 420.16772ms: waiting for machine to come up
	I0717 17:49:36.097264   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:36.097632   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:36.097689   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:36.097608   39957 retry.go:31] will retry after 593.383084ms: waiting for machine to come up
	I0717 17:49:36.692388   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:36.692779   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:36.692805   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:36.692748   39957 retry.go:31] will retry after 522.894623ms: waiting for machine to come up
	I0717 17:49:37.217539   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:37.217939   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:37.217974   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:37.217901   39957 retry.go:31] will retry after 618.384823ms: waiting for machine to come up
	I0717 17:49:37.837779   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:37.838175   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:37.838200   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:37.838142   39957 retry.go:31] will retry after 1.091652031s: waiting for machine to come up
	I0717 17:49:38.931763   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:38.932219   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:38.932247   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:38.932134   39957 retry.go:31] will retry after 1.341674427s: waiting for machine to come up
	I0717 17:49:40.275320   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:40.275792   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:40.275820   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:40.275754   39957 retry.go:31] will retry after 1.293235927s: waiting for machine to come up
	I0717 17:49:41.571340   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:41.571705   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:41.571732   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:41.571661   39957 retry.go:31] will retry after 1.542371167s: waiting for machine to come up
	I0717 17:49:43.115333   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:43.115796   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:43.115826   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:43.115760   39957 retry.go:31] will retry after 1.886589943s: waiting for machine to come up
	I0717 17:49:45.004358   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:45.004727   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:45.004763   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:45.004693   39957 retry.go:31] will retry after 2.72551249s: waiting for machine to come up
	I0717 17:49:47.733475   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:47.733874   39794 main.go:141] libmachine: (ha-333994-m02) DBG | unable to find current IP address of domain ha-333994-m02 in network mk-ha-333994
	I0717 17:49:47.733902   39794 main.go:141] libmachine: (ha-333994-m02) DBG | I0717 17:49:47.733829   39957 retry.go:31] will retry after 3.239443396s: waiting for machine to come up
	I0717 17:49:50.975432   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.975912   39794 main.go:141] libmachine: (ha-333994-m02) Found IP for machine: 192.168.39.127
	I0717 17:49:50.975930   39794 main.go:141] libmachine: (ha-333994-m02) Reserving static IP address...
	I0717 17:49:50.975960   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has current primary IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.976436   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:50.976461   39794 main.go:141] libmachine: (ha-333994-m02) Reserved static IP address: 192.168.39.127
	I0717 17:49:50.976480   39794 main.go:141] libmachine: (ha-333994-m02) DBG | skip adding static IP to network mk-ha-333994 - found existing host DHCP lease matching {name: "ha-333994-m02", mac: "52:54:00:b1:0f:81", ip: "192.168.39.127"}
	I0717 17:49:50.976499   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Getting to WaitForSSH function...
	I0717 17:49:50.976514   39794 main.go:141] libmachine: (ha-333994-m02) Waiting for SSH to be available...
	I0717 17:49:50.978829   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.979226   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:50.979246   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:50.979387   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH client type: external
	I0717 17:49:50.979411   39794 main.go:141] libmachine: (ha-333994-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa (-rw-------)
	I0717 17:49:50.979431   39794 main.go:141] libmachine: (ha-333994-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.127 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0717 17:49:50.979444   39794 main.go:141] libmachine: (ha-333994-m02) DBG | About to run SSH command:
	I0717 17:49:50.979455   39794 main.go:141] libmachine: (ha-333994-m02) DBG | exit 0
	I0717 17:49:51.106070   39794 main.go:141] libmachine: (ha-333994-m02) DBG | SSH cmd err, output: <nil>: 
	I0717 17:49:51.106413   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetConfigRaw
	I0717 17:49:51.106973   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:51.109287   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.109592   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.109618   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.109826   39794 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:49:51.110023   39794 machine.go:94] provisionDockerMachine start ...
	I0717 17:49:51.110040   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.110237   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.112084   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.112321   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.112346   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.112436   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.112578   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.112724   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.112869   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.113027   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.113194   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.113205   39794 main.go:141] libmachine: About to run SSH command:
	hostname
	I0717 17:49:51.214365   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0717 17:49:51.214388   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.214600   39794 buildroot.go:166] provisioning hostname "ha-333994-m02"
	I0717 17:49:51.214629   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.214801   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.217146   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.217465   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.217489   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.217600   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.217758   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.217934   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.218049   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.218223   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.218385   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.218401   39794 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-333994-m02 && echo "ha-333994-m02" | sudo tee /etc/hostname
	I0717 17:49:51.334279   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-333994-m02
	
	I0717 17:49:51.334317   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.337581   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.337905   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.337933   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.338139   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.338346   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.338512   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.338693   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.338845   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:51.339025   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:51.339046   39794 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-333994-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-333994-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-333994-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0717 17:49:51.454925   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0717 17:49:51.454956   39794 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/19283-14409/.minikube CaCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/19283-14409/.minikube}
	I0717 17:49:51.454978   39794 buildroot.go:174] setting up certificates
	I0717 17:49:51.454987   39794 provision.go:84] configureAuth start
	I0717 17:49:51.454999   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetMachineName
	I0717 17:49:51.455257   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:51.457564   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.457851   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.457873   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.458013   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.459810   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.460165   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.460190   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.460306   39794 provision.go:143] copyHostCerts
	I0717 17:49:51.460327   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:51.460352   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem, removing ...
	I0717 17:49:51.460360   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem
	I0717 17:49:51.460411   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/ca.pem (1082 bytes)
	I0717 17:49:51.460474   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:51.460493   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem, removing ...
	I0717 17:49:51.460497   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem
	I0717 17:49:51.460514   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/cert.pem (1123 bytes)
	I0717 17:49:51.460556   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:51.460571   39794 exec_runner.go:144] found /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem, removing ...
	I0717 17:49:51.460577   39794 exec_runner.go:203] rm: /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem
	I0717 17:49:51.460593   39794 exec_runner.go:151] cp: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/19283-14409/.minikube/key.pem (1679 bytes)
	I0717 17:49:51.460641   39794 provision.go:117] generating server cert: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem org=jenkins.ha-333994-m02 san=[127.0.0.1 192.168.39.127 ha-333994-m02 localhost minikube]
	I0717 17:49:51.635236   39794 provision.go:177] copyRemoteCerts
	I0717 17:49:51.635286   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0717 17:49:51.635308   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.638002   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.638369   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.638395   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.638622   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.638815   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.638982   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.639145   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:51.720405   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0717 17:49:51.720478   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0717 17:49:51.746352   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0717 17:49:51.746412   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0717 17:49:51.770628   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0717 17:49:51.770702   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0717 17:49:51.795258   39794 provision.go:87] duration metric: took 340.256082ms to configureAuth
	I0717 17:49:51.795284   39794 buildroot.go:189] setting minikube options for container-runtime
	I0717 17:49:51.795490   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:51.795501   39794 machine.go:97] duration metric: took 685.467301ms to provisionDockerMachine
	I0717 17:49:51.795514   39794 start.go:293] postStartSetup for "ha-333994-m02" (driver="kvm2")
	I0717 17:49:51.795528   39794 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0717 17:49:51.795563   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.795850   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0717 17:49:51.795874   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.798310   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.798696   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.798719   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.798889   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.799047   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.799191   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.799286   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:51.881403   39794 ssh_runner.go:195] Run: cat /etc/os-release
	I0717 17:49:51.885516   39794 info.go:137] Remote host: Buildroot 2023.02.9
	I0717 17:49:51.885542   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/addons for local assets ...
	I0717 17:49:51.885603   39794 filesync.go:126] Scanning /home/jenkins/minikube-integration/19283-14409/.minikube/files for local assets ...
	I0717 17:49:51.885687   39794 filesync.go:149] local asset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> 216612.pem in /etc/ssl/certs
	I0717 17:49:51.885697   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /etc/ssl/certs/216612.pem
	I0717 17:49:51.885773   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0717 17:49:51.894953   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:51.919442   39794 start.go:296] duration metric: took 123.913575ms for postStartSetup
	I0717 17:49:51.919487   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:51.919775   39794 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0717 17:49:51.919801   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:51.922159   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.922506   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:51.922533   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:51.922672   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:51.922878   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:51.923036   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:51.923152   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:52.004408   39794 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0717 17:49:52.004481   39794 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0717 17:49:52.063014   39794 fix.go:56] duration metric: took 18.171175537s for fixHost
	I0717 17:49:52.063061   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:52.065858   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.066239   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.066269   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.066459   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.066648   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.066806   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.066931   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.067086   39794 main.go:141] libmachine: Using SSH client type: native
	I0717 17:49:52.067288   39794 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82da20] 0x830780 <nil>  [] 0s} 192.168.39.127 22 <nil> <nil>}
	I0717 17:49:52.067303   39794 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0717 17:49:52.166802   39794 main.go:141] libmachine: SSH cmd err, output: <nil>: 1721238592.140235525
	
	I0717 17:49:52.166826   39794 fix.go:216] guest clock: 1721238592.140235525
	I0717 17:49:52.166835   39794 fix.go:229] Guest: 2024-07-17 17:49:52.140235525 +0000 UTC Remote: 2024-07-17 17:49:52.063042834 +0000 UTC m=+40.822975139 (delta=77.192691ms)
	I0717 17:49:52.166849   39794 fix.go:200] guest clock delta is within tolerance: 77.192691ms
	I0717 17:49:52.166853   39794 start.go:83] releasing machines lock for "ha-333994-m02", held for 18.275039229s
	I0717 17:49:52.166873   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.167105   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:52.169592   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.169924   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.169948   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.172181   39794 out.go:177] * Found network options:
	I0717 17:49:52.173607   39794 out.go:177]   - NO_PROXY=192.168.39.180
	W0717 17:49:52.174972   39794 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:49:52.175003   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175597   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175781   39794 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:49:52.175858   39794 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0717 17:49:52.175897   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	W0717 17:49:52.175951   39794 proxy.go:119] fail to check proxy env: Error ip not in block
	I0717 17:49:52.176007   39794 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0717 17:49:52.176024   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:49:52.178643   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.178748   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179072   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.179098   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179230   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:52.179248   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.179272   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:52.179432   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:49:52.179524   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.179596   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:49:52.179664   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.179721   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:49:52.179794   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:49:52.179844   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	W0717 17:49:52.256371   39794 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0717 17:49:52.256433   39794 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0717 17:49:52.287825   39794 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0717 17:49:52.287848   39794 start.go:495] detecting cgroup driver to use...
	I0717 17:49:52.287901   39794 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0717 17:49:52.316497   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0717 17:49:52.330140   39794 docker.go:217] disabling cri-docker service (if available) ...
	I0717 17:49:52.330189   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0717 17:49:52.343721   39794 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0717 17:49:52.357273   39794 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0717 17:49:52.483050   39794 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0717 17:49:52.682504   39794 docker.go:233] disabling docker service ...
	I0717 17:49:52.682571   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0717 17:49:52.702383   39794 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0717 17:49:52.717022   39794 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0717 17:49:52.851857   39794 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0717 17:49:52.989407   39794 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0717 17:49:53.003913   39794 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0717 17:49:53.024876   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0717 17:49:53.035470   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0717 17:49:53.046129   39794 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0717 17:49:53.046184   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0717 17:49:53.056553   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:53.067211   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0717 17:49:53.077626   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0717 17:49:53.088680   39794 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0717 17:49:53.100371   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0717 17:49:53.111920   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i '/^ *enable_unprivileged_ports = .*/d' /etc/containerd/config.toml"
	I0717 17:49:53.123072   39794 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)\[plugins."io.containerd.grpc.v1.cri"\]|&\n\1  enable_unprivileged_ports = true|' /etc/containerd/config.toml"
	I0717 17:49:53.133713   39794 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0717 17:49:53.143333   39794 crio.go:166] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0717 17:49:53.143405   39794 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0717 17:49:53.157890   39794 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0717 17:49:53.167934   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:53.302893   39794 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0717 17:49:53.333425   39794 start.go:542] Will wait 60s for socket path /run/containerd/containerd.sock
	I0717 17:49:53.333488   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:53.339060   39794 retry.go:31] will retry after 1.096332725s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0717 17:49:54.435963   39794 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0717 17:49:54.441531   39794 start.go:563] Will wait 60s for crictl version
	I0717 17:49:54.441599   39794 ssh_runner.go:195] Run: which crictl
	I0717 17:49:54.445786   39794 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0717 17:49:54.483822   39794 start.go:579] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.19
	RuntimeApiVersion:  v1
	I0717 17:49:54.483877   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:54.518845   39794 ssh_runner.go:195] Run: containerd --version
	I0717 17:49:54.553079   39794 out.go:177] * Preparing Kubernetes v1.30.2 on containerd 1.7.19 ...
	I0717 17:49:54.554649   39794 out.go:177]   - env NO_PROXY=192.168.39.180
	I0717 17:49:54.556061   39794 main.go:141] libmachine: (ha-333994-m02) Calling .GetIP
	I0717 17:49:54.559046   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:54.559422   39794 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:49:54.559444   39794 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:49:54.559695   39794 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0717 17:49:54.564470   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:54.579269   39794 mustload.go:65] Loading cluster: ha-333994
	I0717 17:49:54.579483   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:54.579765   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:54.579792   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:54.594439   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39001
	I0717 17:49:54.594883   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:54.595350   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:54.595374   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:54.595675   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:54.595858   39794 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:49:54.597564   39794 host.go:66] Checking if "ha-333994" exists ...
	I0717 17:49:54.597896   39794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:49:54.597921   39794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:49:54.613634   39794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34405
	I0717 17:49:54.614031   39794 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:49:54.614493   39794 main.go:141] libmachine: Using API Version  1
	I0717 17:49:54.614511   39794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:49:54.614816   39794 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:49:54.615002   39794 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:49:54.615153   39794 certs.go:68] Setting up /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994 for IP: 192.168.39.127
	I0717 17:49:54.615165   39794 certs.go:194] generating shared ca certs ...
	I0717 17:49:54.615183   39794 certs.go:226] acquiring lock for ca certs: {Name:mkbd59c659d87951ff3ee355cd9afc07084cc973 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:49:54.615314   39794 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key
	I0717 17:49:54.615354   39794 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key
	I0717 17:49:54.615363   39794 certs.go:256] generating profile certs ...
	I0717 17:49:54.615452   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key
	I0717 17:49:54.615493   39794 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key.3a75f3ff
	I0717 17:49:54.615524   39794 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key
	I0717 17:49:54.615535   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0717 17:49:54.615548   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0717 17:49:54.615560   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0717 17:49:54.615575   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0717 17:49:54.615587   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0717 17:49:54.615599   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0717 17:49:54.615635   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0717 17:49:54.615651   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0717 17:49:54.615692   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem (1338 bytes)
	W0717 17:49:54.615716   39794 certs.go:480] ignoring /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661_empty.pem, impossibly tiny 0 bytes
	I0717 17:49:54.615731   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca-key.pem (1679 bytes)
	I0717 17:49:54.615754   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/ca.pem (1082 bytes)
	I0717 17:49:54.615774   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/cert.pem (1123 bytes)
	I0717 17:49:54.615795   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/key.pem (1679 bytes)
	I0717 17:49:54.615829   39794 certs.go:484] found cert: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem (1708 bytes)
	I0717 17:49:54.615854   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem -> /usr/share/ca-certificates/21661.pem
	I0717 17:49:54.615866   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem -> /usr/share/ca-certificates/216612.pem
	I0717 17:49:54.615877   39794 vm_assets.go:164] NewFileAsset: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:54.615902   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:49:54.618791   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:54.619169   39794 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:25:51 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:49:54.619191   39794 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:49:54.619351   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:49:54.619524   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:49:54.619660   39794 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:49:54.619789   39794 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:49:54.694549   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0717 17:49:54.699693   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0717 17:49:54.711136   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0717 17:49:54.715759   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0717 17:49:54.727707   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0717 17:49:54.732038   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0717 17:49:54.743206   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0717 17:49:54.747536   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0717 17:49:54.759182   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0717 17:49:54.763279   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0717 17:49:54.774195   39794 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0717 17:49:54.778345   39794 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1675 bytes)
	I0717 17:49:54.790000   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0717 17:49:54.817482   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0717 17:49:54.842528   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0717 17:49:54.867521   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0717 17:49:54.893528   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1432 bytes)
	I0717 17:49:54.920674   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0717 17:49:54.946673   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0717 17:49:54.972385   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0717 17:49:54.997675   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/certs/21661.pem --> /usr/share/ca-certificates/21661.pem (1338 bytes)
	I0717 17:49:55.023298   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/ssl/certs/216612.pem --> /usr/share/ca-certificates/216612.pem (1708 bytes)
	I0717 17:49:55.048552   39794 ssh_runner.go:362] scp /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0717 17:49:55.073345   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0717 17:49:55.091193   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0717 17:49:55.108383   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0717 17:49:55.125529   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0717 17:49:55.142804   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0717 17:49:55.160482   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1675 bytes)
	I0717 17:49:55.178995   39794 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0717 17:49:55.197026   39794 ssh_runner.go:195] Run: openssl version
	I0717 17:49:55.202998   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/216612.pem && ln -fs /usr/share/ca-certificates/216612.pem /etc/ssl/certs/216612.pem"
	I0717 17:49:55.214662   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.219373   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Jul 17 17:21 /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.219447   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/216612.pem
	I0717 17:49:55.225441   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/216612.pem /etc/ssl/certs/3ec20f2e.0"
	I0717 17:49:55.236543   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0717 17:49:55.247672   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.252336   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Jul 17 17:13 /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.252396   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0717 17:49:55.258207   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0717 17:49:55.269215   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/21661.pem && ln -fs /usr/share/ca-certificates/21661.pem /etc/ssl/certs/21661.pem"
	I0717 17:49:55.280136   39794 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.284763   39794 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Jul 17 17:21 /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.284843   39794 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/21661.pem
	I0717 17:49:55.290471   39794 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/21661.pem /etc/ssl/certs/51391683.0"
	I0717 17:49:55.301174   39794 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0717 17:49:55.305201   39794 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0717 17:49:55.305253   39794 kubeadm.go:934] updating node {m02 192.168.39.127 8443 v1.30.2 containerd true true} ...
	I0717 17:49:55.305343   39794 kubeadm.go:946] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.30.2/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-333994-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.127
	
	[Install]
	 config:
	{KubernetesVersion:v1.30.2 ClusterName:ha-333994 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0717 17:49:55.305377   39794 kube-vip.go:115] generating kube-vip config ...
	I0717 17:49:55.305412   39794 ssh_runner.go:195] Run: sudo sh -c "modprobe --all ip_vs ip_vs_rr ip_vs_wrr ip_vs_sh nf_conntrack"
	I0717 17:49:55.322820   39794 kube-vip.go:167] auto-enabling control-plane load-balancing in kube-vip
	I0717 17:49:55.322885   39794 kube-vip.go:137] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_nodename
	      valueFrom:
	        fieldRef:
	          fieldPath: spec.nodeName
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.8.0
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0717 17:49:55.322938   39794 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.30.2
	I0717 17:49:55.332945   39794 binaries.go:44] Found k8s binaries, skipping transfer
	I0717 17:49:55.333009   39794 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0717 17:49:55.342555   39794 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
	I0717 17:49:55.358883   39794 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0717 17:49:55.375071   39794 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1441 bytes)
	I0717 17:49:55.393413   39794 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0717 17:49:55.397331   39794 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0717 17:49:55.411805   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:55.535806   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:55.554620   39794 start.go:235] Will wait 6m0s for node &{Name:m02 IP:192.168.39.127 Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0717 17:49:55.554913   39794 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:49:55.556751   39794 out.go:177] * Verifying Kubernetes components...
	I0717 17:49:55.558066   39794 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0717 17:49:55.748334   39794 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0717 17:49:56.613699   39794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:49:56.613920   39794 kapi.go:59] client config for ha-333994: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.crt", KeyFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/client.key", CAFile:"/home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1d02420), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0717 17:49:56.613970   39794 kubeadm.go:483] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.180:8443
	I0717 17:49:56.614170   39794 node_ready.go:35] waiting up to 6m0s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:49:56.614265   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:56.614272   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:56.614280   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:56.614286   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:56.627325   39794 round_trippers.go:574] Response Status: 404 Not Found in 13 milliseconds
	I0717 17:49:57.115057   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:57.115083   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:57.115091   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:57.115095   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:57.117582   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:57.614333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:57.614354   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:57.614362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:57.614365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:57.616581   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:58.115352   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:58.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:58.115362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:58.117636   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.615397   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:58.615423   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:58.615434   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:58.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:58.617780   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:58.617919   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:49:59.114753   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:59.114774   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:59.114782   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:59.114787   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:59.116989   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:49:59.615261   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:49:59.615289   39794 round_trippers.go:469] Request Headers:
	I0717 17:49:59.615299   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:49:59.615305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:49:59.617539   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:00.115327   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:00.115348   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:00.115356   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:00.115359   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:00.117595   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:00.615335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:00.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:00.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:00.615371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:00.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:01.115332   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:01.115352   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:01.115360   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:01.115364   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:01.118462   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:01.118555   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:01.614396   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:01.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:01.614425   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:01.614429   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:01.616688   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:02.115381   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:02.115413   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:02.115424   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:02.115429   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:02.117845   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:02.614519   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:02.614541   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:02.614549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:02.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:02.616973   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.114666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:03.114690   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:03.114706   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:03.114711   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:03.116808   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.614478   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:03.614500   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:03.614508   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:03.614512   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:03.616763   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:03.616861   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:04.115079   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:04.115103   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:04.115110   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:04.115116   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:04.117400   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:04.614899   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:04.614922   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:04.614932   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:04.614936   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:04.617138   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.115001   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:05.115024   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:05.115031   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:05.115039   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:05.117375   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.615121   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:05.615145   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:05.615153   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:05.615158   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:05.617472   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:05.617581   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:06.115206   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:06.115226   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:06.115235   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:06.115240   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:06.117694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:06.614430   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:06.614453   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:06.614462   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:06.614467   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:06.616849   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:07.115357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:07.115378   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:07.115386   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:07.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:07.117909   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:07.614460   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:07.614484   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:07.614492   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:07.614497   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:07.617064   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:08.115383   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:08.115405   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:08.115412   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:08.115417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:08.117848   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:08.117947   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:08.614392   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:08.614415   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:08.614423   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:08.614427   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:08.616608   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:09.114929   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:09.114950   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:09.114958   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:09.114962   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:09.117409   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:09.614639   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:09.614659   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:09.614666   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:09.614670   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:09.616904   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.114644   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:10.114668   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:10.114676   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:10.114685   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:10.117224   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.614973   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:10.614995   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:10.615003   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:10.615007   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:10.617362   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:10.617474   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:11.115160   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:11.115187   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:11.115197   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:11.115202   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:11.117916   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:11.615031   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:11.615053   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:11.615061   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:11.615065   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:11.617581   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.115275   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:12.115297   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:12.115305   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:12.115308   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:12.117625   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.615329   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:12.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:12.615364   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:12.615367   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:12.617808   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:12.617929   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:13.114465   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:13.114488   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:13.114497   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:13.114501   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:13.116973   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:13.614674   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:13.614704   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:13.614715   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:13.614721   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:13.617161   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:14.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:14.115351   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:14.115357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:14.115360   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:14.117798   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:14.615028   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:14.615052   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:14.615062   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:14.615068   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:14.617174   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:15.115117   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:15.115140   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:15.115149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:15.115154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:15.117832   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:15.117958   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:15.614474   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:15.614517   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:15.614528   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:15.614534   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:15.616851   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:16.114493   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:16.114529   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:16.114536   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:16.114540   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:16.117140   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:16.614895   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:16.614922   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:16.614935   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:16.614943   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:16.617847   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.114480   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:17.114500   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:17.114507   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:17.114510   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:17.116841   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.614484   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:17.614505   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:17.614512   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:17.614515   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:17.616877   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:17.617049   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:18.115326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:18.115346   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:18.115354   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:18.115358   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:18.117690   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:18.614346   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:18.614364   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:18.614372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:18.614377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:18.617203   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:19.114315   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:19.114349   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:19.114357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:19.114362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:19.119328   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:50:19.614516   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:19.614536   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:19.614544   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:19.614549   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:19.616974   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:19.617173   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:20.114872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:20.114896   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:20.114905   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:20.114908   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:20.117228   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:20.614953   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:20.614974   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:20.614981   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:20.614987   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:20.617553   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:21.115256   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:21.115288   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:21.115297   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:21.115305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:21.117516   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:21.614470   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:21.614493   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:21.614504   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:21.614512   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:21.616801   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:22.114458   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:22.114481   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:22.114491   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:22.114497   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:22.116704   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:22.116814   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:22.614361   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:22.614383   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:22.614391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:22.614395   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:22.616868   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:23.115322   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:23.115343   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:23.115351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:23.115356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:23.117765   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:23.614438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:23.614469   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:23.614480   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:23.614486   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:23.616902   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:24.115254   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:24.115277   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:24.115287   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:24.115292   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:24.117319   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:24.117422   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:24.614618   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:24.614655   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:24.614665   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:24.614669   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:24.617182   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:25.115401   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:25.115422   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:25.115430   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:25.115434   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:25.118835   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:25.614325   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:25.614351   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:25.614361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:25.614366   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:25.616764   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.114413   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:26.114451   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:26.114460   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:26.114464   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:26.117000   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.614789   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:26.614815   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:26.614826   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:26.614831   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:26.617192   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:26.617279   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:27.114863   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:27.114888   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:27.114897   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:27.114903   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:27.117792   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:27.615352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:27.615378   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:27.615389   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:27.615394   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:27.618057   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:28.115330   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:28.115353   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:28.115362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:28.115365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:28.117820   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:28.615355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:28.615377   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:28.615385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:28.615389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:28.619637   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:50:28.619765   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:29.114706   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:29.114727   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:29.114734   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:29.114738   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:29.117064   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:29.614803   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:29.614826   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:29.614835   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:29.614839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:29.617436   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:30.114527   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:30.114550   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:30.114560   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:30.114565   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:30.116856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:30.614518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:30.614542   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:30.614551   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:30.614554   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:30.617139   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:31.114819   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:31.114856   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:31.114867   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:31.114873   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:31.117237   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:31.117345   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:31.615179   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:31.615203   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:31.615219   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:31.615224   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:31.617525   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:32.115306   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:32.115329   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:32.115337   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:32.115341   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:32.117639   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:32.614367   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:32.614391   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:32.614399   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:32.614403   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:32.617172   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:33.115127   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:33.115150   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:33.115158   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:33.115162   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:33.117796   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:33.117911   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:33.614544   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:33.614586   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:33.614597   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:33.614611   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:33.616706   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:34.115175   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:34.115197   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:34.115207   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:34.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:34.117345   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:34.614352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:34.614373   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:34.614380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:34.614384   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:34.616826   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.114840   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:35.114867   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:35.114876   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:35.114881   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:35.117298   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.615114   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:35.615140   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:35.615149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:35.615154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:35.617788   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:35.617897   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:36.115372   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:36.115393   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:36.115402   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:36.115405   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:36.117735   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:36.615357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:36.615376   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:36.615383   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:36.615388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:36.617891   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:37.114533   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:37.114559   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:37.114567   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:37.114572   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:37.117729   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:37.615384   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:37.615406   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:37.615414   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:37.615417   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:37.617760   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:38.114425   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:38.114448   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:38.114455   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:38.114458   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:38.117016   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:38.117135   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:38.614755   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:38.614779   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:38.614787   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:38.614790   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:38.617099   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:39.115282   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:39.115303   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:39.115311   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:39.115315   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:39.117895   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:39.614832   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:39.614853   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:39.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:39.614865   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:39.617355   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:40.115339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:40.115361   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:40.115369   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:40.115373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:40.117661   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:40.117759   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:40.614368   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:40.614389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:40.614396   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:40.614399   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:40.616881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:41.114581   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:41.114606   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:41.114616   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:41.114622   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:41.116877   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:41.614884   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:41.614906   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:41.614914   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:41.614919   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:41.617263   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:42.115156   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:42.115181   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:42.115193   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:42.115201   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:42.117713   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:42.117819   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:42.614328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:42.614348   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:42.614356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:42.614361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:42.617382   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:50:43.115106   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:43.115127   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:43.115135   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:43.115140   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:43.117355   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:43.615121   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:43.615142   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:43.615149   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:43.615154   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:43.617549   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.114805   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:44.114826   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:44.114834   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:44.114839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:44.117204   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.615412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:44.615431   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:44.615439   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:44.615444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:44.617856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:44.617969   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:45.115069   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:45.115093   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:45.115105   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:45.115110   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:45.117220   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:45.614988   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:45.615008   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:45.615015   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:45.615018   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:45.617421   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:46.115156   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:46.115178   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:46.115199   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:46.115203   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:46.117371   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:46.615053   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:46.615076   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:46.615086   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:46.615094   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:46.617407   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:47.115173   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:47.115195   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:47.115203   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:47.115207   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:47.117871   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:47.117975   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:47.614555   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:47.614577   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:47.614586   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:47.614589   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:47.617103   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:48.114743   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:48.114770   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:48.114782   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:48.114787   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:48.116997   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:48.614683   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:48.614710   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:48.614721   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:48.614734   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:48.617185   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.115307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:49.115332   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:49.115343   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:49.115347   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:49.117646   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.614838   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:49.614858   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:49.614872   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:49.614880   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:49.617342   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:49.617440   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:50.115333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:50.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:50.115372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:50.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:50.117536   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:50.615245   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:50.615270   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:50.615278   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:50.615282   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:50.617747   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:51.114366   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:51.114389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:51.114396   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:51.114400   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:51.116597   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:51.614367   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:51.614389   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:51.614397   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:51.614401   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:51.616747   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:52.114431   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:52.114453   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:52.114461   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:52.114464   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:52.117371   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:52.117470   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:52.615088   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:52.615111   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:52.615118   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:52.615122   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:52.617416   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:53.115173   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:53.115195   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:53.115203   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:53.115208   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:53.117683   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:53.614356   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:53.614376   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:53.614384   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:53.614388   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:53.616703   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:54.114990   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:54.115013   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:54.115020   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:54.115024   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:54.117855   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:54.117941   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:54.615104   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:54.615125   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:54.615135   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:54.615140   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:54.617869   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:55.114983   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:55.115005   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:55.115012   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:55.115016   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:55.117396   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:55.615131   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:55.615152   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:55.615168   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:55.615171   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:55.617453   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.115180   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:56.115201   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:56.115209   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:56.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:56.117326   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.615051   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:56.615074   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:56.615082   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:56.615087   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:56.617369   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:56.617480   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:57.115080   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:57.115102   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:57.115110   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:57.115114   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:57.117510   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:57.615224   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:57.615246   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:57.615254   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:57.615258   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:57.617511   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:58.114791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:58.114811   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:58.114819   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:58.114823   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:58.117265   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:58.614995   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:58.615015   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:58.615023   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:58.615028   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:58.617145   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:59.115321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:59.115342   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:59.115350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:59.115353   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:59.117772   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:50:59.117893   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:50:59.614872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:50:59.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:50:59.614903   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:50:59.614906   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:50:59.617194   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:00.115270   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:00.115293   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:00.115301   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:00.115304   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:00.117653   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:00.615357   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:00.615379   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:00.615387   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:00.615391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:00.617720   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.114385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:01.114407   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:01.114413   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:01.114416   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:01.116717   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.614708   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:01.614735   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:01.614745   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:01.614751   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:01.617211   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:01.617309   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:02.114916   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:02.114948   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:02.114956   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:02.114965   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:02.117244   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:02.614964   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:02.614987   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:02.614995   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:02.614999   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:02.617512   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:03.115219   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:03.115239   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:03.115247   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:03.115251   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:03.117907   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:03.614525   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:03.614547   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:03.614557   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:03.614561   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:03.621322   39794 round_trippers.go:574] Response Status: 404 Not Found in 6 milliseconds
	I0717 17:51:03.621424   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:04.114491   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:04.114513   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:04.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:04.114527   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:04.116543   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:04.614666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:04.614688   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:04.614699   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:04.614705   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:04.616831   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:05.114969   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:05.114996   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:05.115003   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:05.115008   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:05.117465   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:05.615208   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:05.615231   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:05.615240   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:05.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:05.617689   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:06.114340   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:06.114360   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:06.114368   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:06.114372   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:06.116445   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:06.116590   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:06.615129   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:06.615154   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:06.615165   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:06.615172   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:06.617588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:07.115324   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:07.115349   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:07.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:07.115363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:07.117841   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:07.614530   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:07.614557   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:07.614566   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:07.614570   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:07.617073   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:08.114714   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:08.114739   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:08.114750   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:08.114756   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:08.117056   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:08.117161   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:08.615333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:08.615352   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:08.615360   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:08.615363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:08.617848   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:09.114938   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:09.114965   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:09.114974   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:09.114980   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:09.118060   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:09.615157   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:09.615177   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:09.615186   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:09.615192   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:09.617894   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:10.115084   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:10.115104   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:10.115112   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:10.115120   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:10.117391   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:10.117508   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:10.615120   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:10.615145   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:10.615155   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:10.615161   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:10.617842   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:11.114485   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:11.114507   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:11.114515   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:11.114520   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:11.117245   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:11.615400   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:11.615426   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:11.615437   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:11.615444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:11.617790   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:12.115351   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:12.115374   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:12.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:12.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:12.117803   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:12.117915   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:12.614461   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:12.614485   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:12.614495   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:12.614500   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:12.617208   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:13.114980   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:13.115005   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:13.115016   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:13.115020   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:13.117385   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:13.615122   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:13.615148   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:13.615160   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:13.615166   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:13.617805   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.115212   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:14.115244   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:14.115253   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:14.115258   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:14.117528   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.614681   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:14.614701   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:14.614711   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:14.614717   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:14.617113   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:14.617211   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:15.115267   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:15.115291   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:15.115302   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:15.115309   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:15.117537   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:15.615307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:15.615331   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:15.615340   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:15.615345   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:15.617660   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.115400   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:16.115426   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:16.115437   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:16.115444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:16.118040   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.614666   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:16.614688   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:16.614698   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:16.614703   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:16.617162   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:16.617258   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:17.114829   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:17.114853   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:17.114863   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:17.114868   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:17.117547   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:17.615274   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:17.615295   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:17.615316   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:17.615323   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:17.617344   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.115064   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:18.115086   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:18.115097   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:18.115101   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:18.117232   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.614999   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:18.615021   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:18.615032   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:18.615037   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:18.617285   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:18.617392   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:19.114376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:19.114407   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:19.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:19.114451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:19.117257   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:19.615315   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:19.615335   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:19.615344   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:19.615350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:19.617155   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:20.115264   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:20.115284   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:20.115292   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:20.115296   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:20.117412   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:20.615133   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:20.615154   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:20.615162   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:20.615165   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:20.616967   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:21.114603   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:21.114639   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:21.114648   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:21.114655   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:21.116866   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:21.116957   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:21.614816   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:21.614841   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:21.614850   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:21.614854   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:21.617362   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:22.115139   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:22.115162   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:22.115170   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:22.115174   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:22.117729   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:22.614412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:22.614434   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:22.614440   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:22.614444   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:22.617178   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.114352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:23.114377   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:23.114388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:23.114392   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:23.116563   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.615345   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:23.615372   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:23.615380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:23.615383   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:23.618002   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:23.618112   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:24.115378   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:24.115401   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:24.115411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:24.115418   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:24.117758   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:24.614891   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:24.614912   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:24.614922   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:24.614926   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:24.617332   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:25.115412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:25.115436   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:25.115445   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:25.115448   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:25.117910   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:25.614339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:25.614363   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:25.614371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:25.614375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:25.617451   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:26.115183   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:26.115207   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:26.115219   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:26.115225   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:26.117163   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:26.117274   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:26.614942   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:26.614966   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:26.614977   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:26.614984   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:26.617676   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:27.115347   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:27.115370   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:27.115380   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:27.115385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:27.117861   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:27.615326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:27.615350   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:27.615359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:27.615363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:27.618250   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:28.114551   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:28.114569   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:28.114577   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:28.114583   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:28.117333   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:28.117440   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:28.615148   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:28.615180   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:28.615191   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:28.615196   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:28.617554   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:29.114764   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:29.114789   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:29.114800   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:29.114804   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:29.116808   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:29.615144   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:29.615168   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:29.615180   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:29.615195   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:29.617588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:30.114646   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:30.114670   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:30.114678   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:30.114681   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:30.116515   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:30.615245   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:30.615265   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:30.615273   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:30.615276   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:30.617998   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:30.618150   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:31.115373   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:31.115395   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:31.115403   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:31.115407   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:31.117657   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:31.614754   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:31.614781   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:31.614789   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:31.614793   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:31.616938   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:32.115334   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:32.115357   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:32.115370   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:32.115374   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:32.117890   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:32.614529   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:32.614551   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:32.614559   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:32.614563   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:32.617063   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:33.114739   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:33.114762   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:33.114769   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:33.114773   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:33.116876   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:33.116968   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:33.614545   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:33.614566   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:33.614574   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:33.614579   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:33.616992   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:34.115382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:34.115403   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:34.115411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:34.115414   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:34.117715   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:34.614863   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:34.614881   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:34.614888   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:34.614893   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:34.617243   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:35.115352   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:35.115375   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:35.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:35.115391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:35.117853   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:35.117957   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:35.614511   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:35.614533   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:35.614541   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:35.614547   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:35.617000   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:36.114661   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:36.114682   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:36.114690   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:36.114695   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:36.117055   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:36.614872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:36.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:36.614903   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:36.614908   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:36.617081   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.114747   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:37.114772   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:37.114783   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:37.114788   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:37.117323   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.615053   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:37.615075   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:37.615086   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:37.615094   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:37.617571   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:37.617677   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:38.115271   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:38.115293   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:38.115301   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:38.115305   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:38.117337   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:38.615114   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:38.615136   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:38.615143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:38.615146   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:38.617524   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:39.114693   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:39.114717   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:39.114726   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:39.114731   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:39.116906   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:39.615059   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:39.615078   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:39.615086   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:39.615090   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:39.617554   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:40.114620   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:40.114645   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:40.114655   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:40.114659   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:40.116637   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:40.116742   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:40.615346   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:40.615368   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:40.615379   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:40.615385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:40.617774   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:41.114442   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:41.114463   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:41.114470   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:41.114474   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:41.116924   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:41.614862   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:41.614882   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:41.614890   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:41.614893   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:41.617121   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:42.114844   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:42.114871   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:42.114880   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:42.114887   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:42.117456   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:42.117549   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:42.615184   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:42.615219   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:42.615228   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:42.615231   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:42.617697   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:43.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:43.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:43.115374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:43.115377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:43.117888   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:43.614542   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:43.614564   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:43.614572   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:43.614575   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:43.617156   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.114390   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:44.114418   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:44.114430   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:44.114435   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:44.116806   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.614781   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:44.614799   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:44.614808   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:44.614813   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:44.616969   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:44.617103   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:45.115008   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:45.115031   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:45.115040   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:45.115044   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:45.117431   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:45.615224   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:45.615252   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:45.615262   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:45.615266   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:45.617533   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.115209   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:46.115230   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:46.115238   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:46.115243   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:46.118193   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.614898   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:46.614921   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:46.614928   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:46.614932   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:46.617234   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:46.617429   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:47.115009   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:47.115032   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:47.115040   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:47.115044   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:47.117484   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:47.615213   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:47.615236   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:47.615245   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:47.615249   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:47.617602   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:48.115343   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:48.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:48.115371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:48.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:48.117939   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:48.614599   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:48.614625   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:48.614634   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:48.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:48.617112   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:49.115322   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:49.115343   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:49.115351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:49.115356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:49.117738   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:49.117854   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:49.614434   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:49.614465   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:49.614475   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:49.614479   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:49.617641   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:50.115335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:50.115358   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:50.115366   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:50.115370   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:50.117407   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:50.615319   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:50.615340   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:50.615348   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:50.615353   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:50.617718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.114376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:51.114398   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:51.114407   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:51.114414   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:51.116810   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.614799   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:51.614831   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:51.614839   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:51.614844   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:51.617260   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:51.617398   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:52.115069   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:52.115094   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:52.115102   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:52.115108   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:52.117538   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:52.615321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:52.615352   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:52.615361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:52.615365   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:52.617834   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:53.114486   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:53.114512   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:53.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:53.114527   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:53.118242   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:51:53.615003   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:53.615034   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:53.615045   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:53.615051   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:53.617718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:53.617826   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:54.115063   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:54.115091   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:54.115100   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:54.115105   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:54.117425   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:54.615271   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:54.615295   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:54.615304   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:54.615309   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:54.617987   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:55.115096   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:55.115119   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:55.115127   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:55.115131   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:55.117113   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:51:55.614857   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:55.614881   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:55.614897   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:55.614903   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:55.617711   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:56.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:56.115349   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:56.115357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:56.115361   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:56.118008   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:56.118139   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:56.614719   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:56.614745   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:56.614752   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:56.614756   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:56.617529   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:57.115288   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:57.115310   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:57.115318   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:57.115321   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:57.117714   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:57.614495   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:57.614517   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:57.614525   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:57.614528   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:57.616925   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.114573   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:58.114598   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:58.114609   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:58.114613   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:58.116783   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.614438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:58.614459   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:58.614469   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:58.614476   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:58.616851   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:58.616956   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:51:59.115030   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:59.115055   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:59.115066   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:59.115073   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:59.117347   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:51:59.615128   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:51:59.615151   39794 round_trippers.go:469] Request Headers:
	I0717 17:51:59.615159   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:51:59.615164   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:51:59.617627   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.114672   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:00.114694   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:00.114702   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:00.114706   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:00.117073   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.614975   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:00.614999   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:00.615009   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:00.615014   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:00.617143   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:00.617251   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:01.114805   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:01.114842   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:01.114852   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:01.114858   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:01.117434   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:01.614407   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:01.614440   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:01.614448   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:01.614453   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:01.617018   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:02.114693   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:02.114715   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:02.114722   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:02.114727   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:02.116963   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:02.614625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:02.614650   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:02.614660   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:02.614664   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:02.617042   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:03.114744   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:03.114767   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:03.114775   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:03.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:03.116932   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:03.117041   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:03.614597   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:03.614618   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:03.614626   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:03.614630   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:03.616748   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:04.115018   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:04.115039   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:04.115049   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:04.115053   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:04.117556   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:04.615321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:04.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:04.615361   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:04.615368   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:04.617694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:05.114830   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:05.114857   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:05.114865   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:05.114869   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:05.117278   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:05.117380   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:05.615000   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:05.615035   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:05.615046   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:05.615052   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:05.617339   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:06.115037   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:06.115056   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:06.115062   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:06.115066   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:06.117588   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:06.614309   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:06.614333   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:06.614341   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:06.614346   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:06.616516   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:07.115312   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:07.115336   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:07.115345   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:07.115349   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:07.117526   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:07.117714   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:07.615376   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:07.615398   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:07.615406   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:07.615410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:07.617826   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:08.114477   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:08.114499   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:08.114507   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:08.114511   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:08.116889   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:08.614611   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:08.614639   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:08.614649   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:08.614658   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:08.616943   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.115169   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:09.115191   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:09.115199   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:09.115202   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:09.117574   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.615328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:09.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:09.615357   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:09.615361   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:09.617889   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:09.618007   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:10.115232   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:10.115254   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:10.115262   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:10.115268   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:10.117721   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:10.614358   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:10.614381   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:10.614388   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:10.614391   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:10.616539   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:11.115338   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:11.115365   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:11.115377   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:11.115384   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:11.117600   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:11.614501   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:11.614525   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:11.614535   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:11.614539   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:11.616883   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:12.114522   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:12.114544   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:12.114552   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:12.114557   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:12.117075   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:12.117189   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:12.614823   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:12.614850   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:12.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:12.614866   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:12.617132   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:13.114797   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:13.114818   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:13.114830   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:13.114835   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:13.117193   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:13.614859   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:13.614880   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:13.614887   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:13.614891   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:13.617224   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:14.114680   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:14.114701   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:14.114708   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:14.114713   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:14.117640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:14.117759   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:14.615371   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:14.615399   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:14.615408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:14.615412   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:14.617899   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:15.115288   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:15.115307   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:15.115316   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:15.115320   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:15.117625   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:15.615379   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:15.615399   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:15.615407   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:15.615410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:15.617678   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.115335   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:16.115358   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:16.115368   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:16.115373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:16.117508   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.615332   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:16.615355   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:16.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:16.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:16.617762   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:16.617852   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:17.115342   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:17.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:17.115374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:17.115380   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:17.117745   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:17.614381   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:17.614404   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:17.614411   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:17.614414   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:17.616676   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:18.114344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:18.114365   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:18.114372   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:18.114377   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:18.116126   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:18.614823   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:18.614850   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:18.614859   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:18.614863   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:18.617249   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:19.114382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:19.114404   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:19.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:19.114422   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:19.116549   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:19.116667   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:19.615132   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:19.615157   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:19.615166   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:19.615171   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:19.617897   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:20.115394   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:20.115422   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:20.115433   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:20.115438   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:20.120626   39794 round_trippers.go:574] Response Status: 404 Not Found in 5 milliseconds
	I0717 17:52:20.615314   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:20.615335   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:20.615343   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:20.615348   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:20.617815   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:21.114476   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:21.114497   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:21.114509   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:21.114516   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:21.116694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:21.116789   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:21.614568   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:21.614590   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:21.614596   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:21.614600   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:21.616740   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:22.114442   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:22.114465   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:22.114472   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:22.114477   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:22.116620   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:22.615373   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:22.615414   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:22.615422   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:22.615425   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:22.617822   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:23.115355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:23.115377   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:23.115385   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:23.115390   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:23.117793   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:23.117961   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:23.614462   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:23.614484   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:23.614492   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:23.614495   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:23.616758   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:24.115153   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:24.115174   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:24.115183   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:24.115187   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:24.117485   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:24.615251   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:24.615278   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:24.615289   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:24.615294   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:24.618155   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.114625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:25.114648   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:25.114656   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:25.114660   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:25.117162   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.614843   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:25.614863   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:25.614871   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:25.614875   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:25.616943   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:25.617057   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:26.114625   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:26.114665   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:26.114677   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:26.114681   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:26.116743   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:26.614490   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:26.614512   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:26.614521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:26.614524   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:26.616812   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:27.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:27.115366   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:27.115375   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:27.115379   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:27.117751   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:27.614385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:27.614429   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:27.614436   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:27.614440   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:27.616766   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:28.114438   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:28.114463   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:28.114472   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:28.114476   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:28.116881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:28.116995   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:28.614550   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:28.614573   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:28.614583   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:28.614589   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:28.616576   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:29.114665   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:29.114688   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:29.114697   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:29.114701   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:29.116949   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:29.614618   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:29.614639   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:29.614647   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:29.614652   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:29.617229   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:30.114692   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:30.114711   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:30.114718   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:30.114725   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:30.116453   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:52:30.615200   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:30.615233   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:30.615241   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:30.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:30.617947   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:30.618078   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:31.114620   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:31.114663   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:31.114674   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:31.114677   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:31.116821   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:31.614807   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:31.614849   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:31.614857   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:31.614861   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:31.617107   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:32.114733   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:32.114772   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:32.114780   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:32.114784   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:32.117117   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:32.614873   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:32.614895   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:32.614906   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:32.614913   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:32.617084   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:33.114744   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:33.114767   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:33.114774   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:33.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:33.116968   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:33.117056   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:33.614614   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:33.614634   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:33.614642   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:33.614648   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:33.616694   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:34.114989   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:34.115010   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:34.115019   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:34.115023   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:34.117256   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:34.615017   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:34.615039   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:34.615046   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:34.615049   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:34.617305   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.114707   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:35.114729   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:35.114737   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:35.114741   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:35.116837   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.614518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:35.614541   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:35.614549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:35.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:35.617169   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:35.617264   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:36.114880   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:36.114903   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:36.114912   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:36.114915   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:36.117413   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:36.615154   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:36.615178   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:36.615186   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:36.615189   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:36.617681   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:37.114404   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:37.114427   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:37.114435   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:37.114438   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:37.116709   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:37.614419   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:37.614444   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:37.614452   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:37.614465   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:37.616814   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:38.114522   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:38.114550   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:38.114560   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:38.114566   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:38.117012   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:38.117111   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:38.614715   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:38.614738   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:38.614746   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:38.614750   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:38.617029   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:39.115300   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:39.115321   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:39.115330   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:39.115334   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:39.117647   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:39.615387   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:39.615412   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:39.615418   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:39.615422   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:39.617840   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.114520   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:40.114541   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:40.114548   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:40.114553   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:40.116874   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.614642   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:40.614667   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:40.614677   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:40.614682   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:40.617201   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:40.617299   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:41.114884   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:41.114913   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:41.114925   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:41.114930   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:41.117705   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:41.614760   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:41.614784   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:41.614793   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:41.614799   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:41.617304   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.115055   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:42.115077   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:42.115086   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:42.115092   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:42.117464   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.615207   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:42.615231   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:42.615242   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:42.615246   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:42.617788   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:42.617906   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:43.114443   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:43.114471   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:43.114484   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:43.114489   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:43.116804   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:43.614503   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:43.614534   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:43.614546   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:43.614553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:43.616923   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:44.114333   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:44.114362   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:44.114371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:44.114376   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:44.116593   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:44.615353   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:44.615375   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:44.615383   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:44.615387   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:44.619020   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:44.619252   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:45.114535   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:45.114558   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:45.114565   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:45.114568   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:45.116805   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:45.614455   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:45.614477   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:45.614485   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:45.614489   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:45.616531   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:46.115306   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:46.115327   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:46.115334   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:46.115340   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:46.117430   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:46.615326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:46.615349   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:46.615358   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:46.615364   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:46.617638   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:47.115375   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:47.115397   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:47.115405   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:47.115410   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:47.117966   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:47.118069   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:47.614605   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:47.614627   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:47.614635   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:47.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:47.617373   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:48.115142   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:48.115164   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:48.115173   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:48.115177   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:48.117353   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:48.615075   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:48.615097   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:48.615105   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:48.615109   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:48.617317   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.114470   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:49.114492   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:49.114501   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:49.114506   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:49.116813   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.615412   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:49.615433   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:49.615441   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:49.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:49.617717   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:49.617816   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:50.115355   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:50.115376   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:50.115384   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:50.115389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:50.117802   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:50.614440   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:50.614462   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:50.614469   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:50.614474   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:50.616542   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:51.115295   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:51.115318   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:51.115325   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:51.115329   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:51.118739   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:51.614657   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:51.614694   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:51.614703   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:51.614708   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:51.616892   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:52.114541   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:52.114568   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:52.114575   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:52.114578   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:52.117054   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:52.117156   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:52.614718   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:52.614748   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:52.614759   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:52.614765   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:52.617263   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:53.114959   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:53.114984   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:53.114996   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:53.115000   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:53.117274   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:53.615035   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:53.615060   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:53.615070   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:53.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:53.617250   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.114646   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:54.114679   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:54.114686   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:54.114694   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:54.116952   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.614585   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:54.614604   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:54.614612   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:54.614615   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:54.616959   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:54.617087   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:55.114518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:55.114543   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:55.114550   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:55.114556   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:55.117176   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:55.614804   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:55.614830   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:55.614843   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:55.614848   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:55.617029   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.114710   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:56.114739   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:56.114750   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:56.114757   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:56.117352   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.615042   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:56.615064   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:56.615072   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:56.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:56.617503   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:56.617629   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:57.115247   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:57.115273   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:57.115283   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:57.115289   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:57.119157   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:52:57.614778   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:57.614799   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:57.614808   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:57.614812   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:57.617771   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:58.114423   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:58.114444   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:58.114451   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:58.114455   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:58.116940   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:58.614594   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:58.614616   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:58.614626   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:58.614631   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:58.616901   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:59.114914   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:59.114934   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:59.114942   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:59.114945   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:59.117144   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:52:59.117235   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:52:59.614791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:52:59.614814   39794 round_trippers.go:469] Request Headers:
	I0717 17:52:59.614822   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:52:59.614827   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:52:59.617115   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:00.115321   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:00.115354   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:00.115362   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:00.115366   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:00.117649   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:00.615378   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:00.615400   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:00.615411   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:00.615416   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:00.617719   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:01.114375   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:01.114397   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:01.114404   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:01.114408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:01.116718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:01.614966   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:01.614991   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:01.615002   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:01.615011   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:01.618973   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:01.619078   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:02.114685   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:02.114710   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:02.114718   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:02.114723   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:02.117526   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:02.615258   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:02.615281   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:02.615289   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:02.615293   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:02.617822   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:03.115326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:03.115355   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:03.115366   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:03.115371   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:03.117667   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:03.615340   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:03.615365   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:03.615374   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:03.615379   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:03.617818   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:04.115204   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:04.115226   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:04.115234   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:04.115238   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:04.117764   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:04.117866   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:04.615339   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:04.615357   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:04.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:04.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:04.617952   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:05.114451   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:05.114472   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:05.114480   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:05.114484   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:05.116809   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:05.614454   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:05.614475   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:05.614482   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:05.614487   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:05.616856   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.114518   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:06.114541   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:06.114549   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:06.114553   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:06.117433   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.615116   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:06.615137   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:06.615145   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:06.615149   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:06.617328   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:06.617423   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:07.115073   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:07.115096   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:07.115105   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:07.115109   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:07.117243   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:07.614957   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:07.614980   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:07.614988   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:07.614992   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:07.617455   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:08.115203   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:08.115228   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:08.115237   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:08.115242   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:08.117953   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:08.614601   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:08.614621   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:08.614627   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:08.614632   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:08.616977   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:09.115171   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:09.115192   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:09.115200   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:09.115204   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:09.117505   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:09.117620   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:09.615237   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:09.615259   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:09.615266   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:09.615270   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:09.617567   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:10.115157   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:10.115180   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:10.115188   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:10.115191   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:10.117490   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:10.615247   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:10.615268   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:10.615277   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:10.615280   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:10.618489   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:11.115353   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:11.115374   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:11.115382   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:11.115385   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:11.118557   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:11.118654   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:11.614419   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:11.614439   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:11.614447   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:11.614451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:11.616736   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:12.114441   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:12.114467   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:12.114475   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:12.114479   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:12.117113   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:12.615359   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:12.615379   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:12.615387   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:12.615390   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:12.617471   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.115196   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:13.115221   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:13.115230   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:13.115235   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:13.117548   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.615239   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:13.615269   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:13.615279   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:13.615285   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:13.617765   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:13.617868   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:14.115201   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:14.115222   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:14.115230   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:14.115238   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:14.118205   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:14.614910   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:14.614930   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:14.614941   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:14.614946   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:14.617345   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.114915   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:15.114940   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:15.114953   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:15.114959   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:15.117285   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.615063   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:15.615091   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:15.615102   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:15.615109   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:15.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:15.617892   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:16.114326   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:16.114345   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:16.114353   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:16.114358   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:16.116687   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:16.614425   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:16.614445   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:16.614456   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:16.614463   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:16.616497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:17.115235   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:17.115266   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:17.115275   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:17.115281   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:17.117592   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:17.615370   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:17.615394   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:17.615403   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:17.615408   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:17.617640   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:18.115421   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:18.115449   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:18.115460   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:18.115466   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:18.117540   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:18.117666   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:18.615244   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:18.615268   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:18.615280   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:18.615285   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:18.617069   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:53:19.115249   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:19.115272   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:19.115282   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:19.115288   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:19.117713   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:19.614391   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:19.614427   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:19.614435   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:19.614439   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:19.616687   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:20.115222   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:20.115243   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:20.115251   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:20.115255   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:20.119958   39794 round_trippers.go:574] Response Status: 404 Not Found in 4 milliseconds
	I0717 17:53:20.120050   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:20.614619   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:20.614641   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:20.614651   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:20.614658   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:20.617751   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:21.115329   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:21.115350   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:21.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:21.115362   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:21.118322   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:21.615343   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:21.615364   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:21.615373   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:21.615376   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:21.617662   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.114307   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:22.114356   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:22.114367   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:22.114373   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:22.116718   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.614407   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:22.614436   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:22.614447   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:22.614452   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:22.616582   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:22.616699   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:23.115301   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:23.115323   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:23.115331   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:23.115335   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:23.117744   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:23.614413   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:23.614437   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:23.614447   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:23.614453   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:23.616559   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.115103   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:24.115133   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:24.115143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:24.115147   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:24.117693   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.614545   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:24.614569   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:24.614577   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:24.614581   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:24.617065   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:24.617179   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:25.114461   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:25.114485   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:25.114493   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:25.114496   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:25.116786   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:25.614416   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:25.614438   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:25.614446   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:25.614451   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:25.616751   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.114388   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:26.114410   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:26.114417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:26.114421   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:26.116745   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.614603   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:26.614626   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:26.614634   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:26.614639   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:26.617102   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:26.617207   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:27.114783   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:27.114807   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:27.114818   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:27.114826   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:27.117702   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:27.614374   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:27.614412   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:27.614420   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:27.614425   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:27.616497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.115222   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:28.115243   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:28.115250   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:28.115254   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:28.117523   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.615319   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:28.615342   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:28.615350   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:28.615354   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:28.617775   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:28.617869   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:29.114872   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:29.114893   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:29.114901   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:29.114907   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:29.116856   39794 round_trippers.go:574] Response Status: 404 Not Found in 1 milliseconds
	I0717 17:53:29.615278   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:29.615300   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:29.615308   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:29.615313   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:29.617690   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:30.115328   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:30.115351   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:30.115359   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:30.115363   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:30.117881   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:30.614571   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:30.614593   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:30.614601   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:30.614605   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:30.617497   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:31.115219   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:31.115240   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:31.115247   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:31.115252   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:31.117580   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:31.117691   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:31.614491   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:31.614514   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:31.614520   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:31.614525   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:31.616752   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:32.114434   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:32.114457   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:32.114465   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:32.114469   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:32.116843   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:32.614510   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:32.614531   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:32.614537   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:32.614540   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:32.617151   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:33.114829   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:33.114852   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:33.114859   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:33.114863   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:33.117627   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:33.117746   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:33.615336   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:33.615356   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:33.615365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:33.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:33.617473   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:34.114732   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:34.114770   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:34.114783   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:34.114788   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:34.117561   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:34.615316   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:34.615341   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:34.615351   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:34.615356   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:34.618153   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.114569   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:35.114593   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:35.114601   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:35.114605   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:35.116953   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.614348   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:35.614373   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:35.614383   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:35.614389   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:35.617139   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:35.617237   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:36.114791   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:36.114812   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:36.114819   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:36.114823   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:36.117593   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:36.615382   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:36.615407   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:36.615417   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:36.615426   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:36.618040   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:37.114722   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:37.114753   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:37.114761   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:37.114765   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:37.116947   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:37.614643   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:37.614686   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:37.614697   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:37.614702   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:37.616876   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:38.114536   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:38.114559   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:38.114566   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:38.114570   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:38.117369   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:38.117462   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:38.615126   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:38.615148   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:38.615156   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:38.615160   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:38.617869   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:39.115081   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:39.115113   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:39.115122   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:39.115126   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:39.117948   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:39.614619   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:39.614647   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:39.614659   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:39.614665   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:39.617484   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.115106   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:40.115131   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:40.115143   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:40.115149   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:40.117287   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.615033   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:40.615059   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:40.615071   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:40.615076   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:40.617572   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:40.617676   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:41.115286   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:41.115309   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:41.115316   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:41.115321   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:41.117762   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:41.614713   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:41.614734   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:41.614743   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:41.614747   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:41.617493   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:42.115269   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:42.115292   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:42.115303   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:42.115308   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:42.117720   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:42.614392   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:42.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:42.614427   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:42.614434   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:42.616931   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:43.115385   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:43.115412   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:43.115425   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:43.115433   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:43.118066   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:43.118207   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:43.614713   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:43.614753   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:43.614765   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:43.614770   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:43.617067   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:44.114374   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:44.114406   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:44.114415   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:44.114419   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:44.116619   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:44.615405   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:44.615433   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:44.615441   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:44.615445   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:44.617626   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.115126   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:45.115150   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:45.115158   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:45.115163   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:45.117350   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.615112   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:45.615135   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:45.615142   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:45.615147   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:45.617618   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:45.617714   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:46.115344   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:46.115364   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:46.115371   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:46.115374   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:46.117523   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:46.615363   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:46.615386   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:46.615394   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:46.615398   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:46.617675   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.114336   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:47.114357   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:47.114365   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:47.114369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:47.116450   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.615209   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:47.615232   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:47.615242   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:47.615248   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:47.617669   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:47.617889   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:48.114456   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:48.114479   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:48.114488   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:48.114491   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:48.116715   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:48.614390   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:48.614416   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:48.614424   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:48.614427   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:48.616735   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:49.114828   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:49.114850   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:49.114858   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:49.114863   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:49.117111   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:49.614976   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:49.614997   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:49.615005   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:49.615010   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:49.617505   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:50.115004   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:50.115026   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:50.115033   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:50.115038   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:50.117347   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:50.117441   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:50.615143   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:50.615170   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:50.615179   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:50.615187   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:50.617427   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:51.115170   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:51.115193   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:51.115205   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:51.115213   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:51.117289   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:51.615380   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:51.615407   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:51.615419   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:51.615426   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:51.618038   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.114724   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:52.114760   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:52.114773   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:52.114779   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:52.117189   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.614887   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:52.614911   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:52.614922   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:52.614927   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:52.617222   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:52.617335   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:53.114967   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:53.114994   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:53.115005   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:53.115013   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:53.117578   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:53.614368   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:53.614394   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:53.614404   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:53.614412   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:53.617467   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:54.114883   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:54.114906   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:54.114915   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:54.114921   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:54.117603   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:54.615330   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:54.615353   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:54.615364   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:54.615369   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:54.618101   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:54.618221   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:55.114614   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:55.114640   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:55.114649   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:55.114656   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:55.117436   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:55.615236   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:55.615260   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:55.615270   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:55.615276   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:55.617974   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:56.114490   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:56.114511   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:56.114521   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:56.114524   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:56.117090   39794 round_trippers.go:574] Response Status: 404 Not Found in 2 milliseconds
	I0717 17:53:56.614907   39794 round_trippers.go:463] GET https://192.168.39.180:8443/api/v1/nodes/ha-333994-m02
	I0717 17:53:56.614932   39794 round_trippers.go:469] Request Headers:
	I0717 17:53:56.614943   39794 round_trippers.go:473]     Accept: application/json, */*
	I0717 17:53:56.614948   39794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0717 17:53:56.618676   39794 round_trippers.go:574] Response Status: 404 Not Found in 3 milliseconds
	I0717 17:53:56.618791   39794 node_ready.go:53] error getting node "ha-333994-m02": nodes "ha-333994-m02" not found
	I0717 17:53:56.618808   39794 node_ready.go:38] duration metric: took 4m0.004607374s for node "ha-333994-m02" to be "Ready" ...
	I0717 17:53:56.620932   39794 out.go:177] 
	W0717 17:53:56.622268   39794 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0717 17:53:56.622282   39794 out.go:239] * 
	W0717 17:53:56.623241   39794 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0717 17:53:56.625101   39794 out.go:177] 
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	4c2118d2ed18a       6e38f40d628db       3 minutes ago       Running             storage-provisioner       2                   700d9f5e713d3       storage-provisioner
	dd5e8f56c4264       5cc3abe5717db       4 minutes ago       Running             kindnet-cni               1                   dbdf19f96898d       kindnet-5zksq
	b50ede0dde503       cbb01a7bd410d       4 minutes ago       Running             coredns                   1                   4c25cc8ac2148       coredns-7db6d8ff4d-n4xtd
	b27c10fa3251b       8c811b4aec35f       4 minutes ago       Running             busybox                   1                   c15a92e53e40d       busybox-fc5497c4f-5ngfp
	85983f98f84b9       cbb01a7bd410d       4 minutes ago       Running             coredns                   1                   507cc72648f25       coredns-7db6d8ff4d-sh96r
	603ad8840c526       6e38f40d628db       4 minutes ago       Exited              storage-provisioner       1                   700d9f5e713d3       storage-provisioner
	cede48d48fe27       53c535741fb44       4 minutes ago       Running             kube-proxy                1                   1b59105c6df2e       kube-proxy-jlzt5
	7f7ede089f3e7       7820c83aa1394       4 minutes ago       Running             kube-scheduler            1                   903065308cbb5       kube-scheduler-ha-333994
	38a3e6e69ce36       e874818b3caac       4 minutes ago       Running             kube-controller-manager   1                   bfcca696b5273       kube-controller-manager-ha-333994
	3c3e7888bdfe6       56ce0fd9fb532       4 minutes ago       Running             kube-apiserver            1                   2a8a2b0c39cd0       kube-apiserver-ha-333994
	41d1b53347d3e       3861cfcd7c04c       4 minutes ago       Running             etcd                      1                   7982d05a46241       etcd-ha-333994
	529be299dc3b8       38af8ddebf499       4 minutes ago       Running             kube-vip                  0                   fb62346baad47       kube-vip-ha-333994
	db107babf5b82       8c811b4aec35f       26 minutes ago      Exited              busybox                   0                   d9ed5134ca786       busybox-fc5497c4f-5ngfp
	dcb6f2bdfe23d       cbb01a7bd410d       27 minutes ago      Exited              coredns                   0                   3e096287e39aa       coredns-7db6d8ff4d-n4xtd
	5e03d17e52e34       cbb01a7bd410d       27 minutes ago      Exited              coredns                   0                   a55470f3593c5       coredns-7db6d8ff4d-sh96r
	f1b88563e61d6       5cc3abe5717db       27 minutes ago      Exited              kindnet-cni               0                   18bb6baa955c0       kindnet-5zksq
	0a2a73f6200a3       53c535741fb44       27 minutes ago      Exited              kube-proxy                0                   44d5a25817f0f       kube-proxy-jlzt5
	d3a0374a88e2c       56ce0fd9fb532       28 minutes ago      Exited              kube-apiserver            0                   69d556e9fd975       kube-apiserver-ha-333994
	2f62c96e1a784       7820c83aa1394       28 minutes ago      Exited              kube-scheduler            0                   14cc4b6f0a671       kube-scheduler-ha-333994
	5f332be219358       3861cfcd7c04c       28 minutes ago      Exited              etcd                      0                   2fa30f34188fb       etcd-ha-333994
	515c5ff9f46da       e874818b3caac       28 minutes ago      Exited              kube-controller-manager   0                   800370bd69668       kube-controller-manager-ha-333994
	
	
	==> containerd <==
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.464434972Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.673549472Z" level=info msg="RemoveContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\""
	Jul 17 17:50:20 ha-333994 containerd[839]: time="2024-07-17T17:50:20.682188663Z" level=info msg="RemoveContainer for \"86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.314045705Z" level=info msg="RemoveContainer for \"2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.319121815Z" level=info msg="RemoveContainer for \"2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320511033Z" level=info msg="StopPodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320605313Z" level=info msg="TearDown network for sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320616460Z" level=info msg="StopPodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.320971991Z" level=info msg="RemovePodSandbox for \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.321016823Z" level=info msg="Forcibly stopping sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.321072160Z" level=info msg="TearDown network for sandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.325612741Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.325748048Z" level=info msg="RemovePodSandbox \"4ae1e67fc3bab5bbd9a5e5575cb054716cb84745a6c3f9dcbd0081499baa6010\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326267222Z" level=info msg="StopPodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326463624Z" level=info msg="TearDown network for sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326510323Z" level=info msg="StopPodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" returns successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326827690Z" level=info msg="RemovePodSandbox for \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326922590Z" level=info msg="Forcibly stopping sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\""
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.326997124Z" level=info msg="TearDown network for sandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" successfully"
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.331124459Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
	Jul 17 17:50:33 ha-333994 containerd[839]: time="2024-07-17T17:50:33.331204383Z" level=info msg="RemovePodSandbox \"08971202a22cca0001836ef30528c1ddd623e32298e96aa9b8ee8badacfa299b\" returns successfully"
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.387511700Z" level=info msg="CreateContainer within sandbox \"700d9f5e713d3946ac2752599935acff0c22e7d5b1d38328f08b4514902b10af\" for container &ContainerMetadata{Name:storage-provisioner,Attempt:2,}"
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.414846958Z" level=info msg="CreateContainer within sandbox \"700d9f5e713d3946ac2752599935acff0c22e7d5b1d38328f08b4514902b10af\" for &ContainerMetadata{Name:storage-provisioner,Attempt:2,} returns container id \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\""
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.415806226Z" level=info msg="StartContainer for \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\""
	Jul 17 17:50:36 ha-333994 containerd[839]: time="2024-07-17T17:50:36.483461513Z" level=info msg="StartContainer for \"4c2118d2ed18a639a0293e3837cbc5c0b1325b3c7d157000e012d34faeddd714\" returns successfully"
	
	
	==> coredns [5e03d17e52e34f0695bfa49800923a86525fd46883d344192dfddffda1bb3e8a] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:45601 - 22388 "HINFO IN 667985956384862735.408586044970053011. udp 55 false 512" NXDOMAIN qr,rd,ra 55 0.010632325s
	[INFO] 10.244.0.4:39902 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.001112995s
	[INFO] 10.244.0.4:36119 - 3 "AAAA IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 31 0.006211328s
	[INFO] 10.244.0.4:35643 - 4 "A IN kubernetes.io. udp 31 false 512" NOERROR qr,rd,ra 60 0.002998741s
	[INFO] 10.244.0.4:48034 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000130632s
	[INFO] 10.244.0.4:36473 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.009192909s
	[INFO] 10.244.0.4:56014 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000187935s
	[INFO] 10.244.0.4:46499 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000109005s
	[INFO] 10.244.0.4:54296 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003354346s
	[INFO] 10.244.0.4:37513 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000159081s
	[INFO] 10.244.0.4:40983 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000203833s
	[INFO] 10.244.0.4:55998 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000218974s
	[INFO] 10.244.0.4:35414 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000163846s
	
	
	==> coredns [85983f98f84b97a11a481548c17b6e998bfec291ea5b38640a0522d82a174e86] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:32930 - 39231 "HINFO IN 1138402013862295929.6773124709558145559. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.011527303s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[649992777]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.508) (total time: 30004ms):
	Trace[649992777]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30004ms (17:50:20.513)
	Trace[649992777]: [30.004346914s] [30.004346914s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[119638294]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.509) (total time: 30004ms):
	Trace[119638294]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30003ms (17:50:20.512)
	Trace[119638294]: [30.004435266s] [30.004435266s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1087831118]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.513) (total time: 30001ms):
	Trace[1087831118]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:50:20.514)
	Trace[1087831118]: [30.001558122s] [30.001558122s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [b50ede0dde50338ef9fddc834d572f0d265fdc75b3a6e0ffab0b3a090f0cfac9] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:35715 - 11457 "HINFO IN 3013652693694148412.8082718229865211359. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.009035708s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1696274823]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.643) (total time: 30002ms):
	Trace[1696274823]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30001ms (17:50:20.645)
	Trace[1696274823]: [30.002410627s] [30.002410627s] END
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[990945787]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.645) (total time: 30001ms):
	Trace[990945787]: ---"Objects listed" error:Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:50:20.645)
	Trace[990945787]: [30.00126887s] [30.00126887s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.96.0.1:443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[INFO] plugin/kubernetes: Trace[1760112988]: "Reflector ListAndWatch" name:pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231 (17-Jul-2024 17:49:50.646) (total time: 30000ms):
	Trace[1760112988]: ---"Objects listed" error:Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout 30000ms (17:50:20.646)
	Trace[1760112988]: [30.000893639s] [30.000893639s] END
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.EndpointSlice: failed to list *v1.EndpointSlice: Get "https://10.96.0.1:443/apis/discovery.k8s.io/v1/endpointslices?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	[ERROR] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.27.4/tools/cache/reflector.go:231: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://10.96.0.1:443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [dcb6f2bdfe23d3e6924f51ebb8a33d8431d3ee154daf348c93ed18f38d0c971f] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.11.1
	linux/amd64, go1.20.7, ae2bbc2
	[INFO] 127.0.0.1:37241 - 12580 "HINFO IN 7703422814786955468.6939822740795333208. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.008540763s
	[INFO] 10.244.0.4:40693 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.063212279s
	[INFO] 10.244.0.4:33058 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000224675s
	[INFO] 10.244.0.4:59547 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000237944s
	[INFO] 10.244.0.4:52878 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000121777s
	[INFO] 10.244.0.4:33742 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000168604s
	[INFO] 10.244.0.4:54617 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000233778s
	[INFO] 10.244.0.4:45070 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000223029s
	[INFO] 10.244.0.4:47699 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000089411s
	
	
	==> describe nodes <==
	Name:               ha-333994
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-333994
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=904d419c46be1a7134dbdb5e29deb5c439653f86
	                    minikube.k8s.io/name=ha-333994
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_07_17T17_26_17_0700
	                    minikube.k8s.io/version=v1.33.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Wed, 17 Jul 2024 17:26:15 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-333994
	  AcquireTime:     <unset>
	  RenewTime:       Wed, 17 Jul 2024 17:54:02 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:15 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Wed, 17 Jul 2024 17:49:47 +0000   Wed, 17 Jul 2024 17:26:46 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.180
	  Hostname:    ha-333994
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164184Ki
	  pods:               110
	System Info:
	  Machine ID:                 da3e8959a305489b85ad0eed18b3234d
	  System UUID:                da3e8959-a305-489b-85ad-0eed18b3234d
	  Boot ID:                    4c5a3bea-29ed-4c23-a2f3-16d92a2e967b
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.19
	  Kubelet Version:            v1.30.2
	  Kube-Proxy Version:         v1.30.2
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-fc5497c4f-5ngfp              0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 coredns-7db6d8ff4d-n4xtd             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     27m
	  kube-system                 coredns-7db6d8ff4d-sh96r             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     27m
	  kube-system                 etcd-ha-333994                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         27m
	  kube-system                 kindnet-5zksq                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      27m
	  kube-system                 kube-apiserver-ha-333994             250m (12%)    0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-controller-manager-ha-333994    200m (10%)    0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-proxy-jlzt5                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-scheduler-ha-333994             100m (5%)     0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-vip-ha-333994                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m22s
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         27m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 27m                    kube-proxy       
	  Normal  Starting                 4m20s                  kube-proxy       
	  Normal  Starting                 28m                    kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  28m (x4 over 28m)      kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    28m (x4 over 28m)      kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     28m (x3 over 28m)      kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  28m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientPID     27m                    kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeHasSufficientMemory  27m                    kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    27m                    kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeAllocatableEnforced  27m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 27m                    kubelet          Starting kubelet.
	  Normal  RegisteredNode           27m                    node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	  Normal  NodeReady                27m                    kubelet          Node ha-333994 status is now: NodeReady
	  Normal  Starting                 4m38s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  4m38s (x8 over 4m38s)  kubelet          Node ha-333994 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    4m38s (x8 over 4m38s)  kubelet          Node ha-333994 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     4m38s (x7 over 4m38s)  kubelet          Node ha-333994 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  4m38s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           4m16s                  node-controller  Node ha-333994 event: Registered Node ha-333994 in Controller
	
	
	==> dmesg <==
	[Jul17 17:49] You have booted with nomodeset. This means your GPU drivers are DISABLED
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.050055] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.040308] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.524310] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +2.354966] systemd-fstab-generator[116]: Ignoring "noauto" option for root device
	[  +1.596488] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000007] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +7.929260] systemd-fstab-generator[758]: Ignoring "noauto" option for root device
	[  +0.058074] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.064860] systemd-fstab-generator[770]: Ignoring "noauto" option for root device
	[  +0.158074] systemd-fstab-generator[784]: Ignoring "noauto" option for root device
	[  +0.141409] systemd-fstab-generator[796]: Ignoring "noauto" option for root device
	[  +0.316481] systemd-fstab-generator[830]: Ignoring "noauto" option for root device
	[  +1.413303] systemd-fstab-generator[905]: Ignoring "noauto" option for root device
	[  +6.936615] kauditd_printk_skb: 197 callbacks suppressed
	[  +9.904333] kauditd_printk_skb: 40 callbacks suppressed
	[  +6.090710] kauditd_printk_skb: 81 callbacks suppressed
	
	
	==> etcd [41d1b53347d3ec95c0752a7b8006e52252561ffd6b0613e71f4c4d1a66d84cd1] <==
	{"level":"info","ts":"2024-07-17T17:49:40.746451Z","caller":"fileutil/purge.go:50","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/snap","suffix":"snap","max":5,"interval":"30s"}
	{"level":"info","ts":"2024-07-17T17:49:40.746545Z","caller":"fileutil/purge.go:50","msg":"started to purge file","dir":"/var/lib/minikube/etcd/member/wal","suffix":"wal","max":5,"interval":"30s"}
	{"level":"info","ts":"2024-07-17T17:49:40.747109Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 switched to configuration voters=(808613133158692504)"}
	{"level":"info","ts":"2024-07-17T17:49:40.74735Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","added-peer-id":"b38c55c42a3b698","added-peer-peer-urls":["https://192.168.39.180:2380"]}
	{"level":"info","ts":"2024-07-17T17:49:40.747698Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:49:40.747826Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:49:40.768847Z","caller":"embed/etcd.go:726","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2024-07-17T17:49:40.769611Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"b38c55c42a3b698","initial-advertise-peer-urls":["https://192.168.39.180:2380"],"listen-peer-urls":["https://192.168.39.180:2380"],"advertise-client-urls":["https://192.168.39.180:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.39.180:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2024-07-17T17:49:40.771975Z","caller":"embed/etcd.go:857","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2024-07-17T17:49:40.783644Z","caller":"embed/etcd.go:597","msg":"serving peer traffic","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:49:40.784432Z","caller":"embed/etcd.go:569","msg":"cmux::serve","address":"192.168.39.180:2380"}
	{"level":"info","ts":"2024-07-17T17:49:42.218092Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 is starting a new election at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.218153Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became pre-candidate at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.21819Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgPreVoteResp from b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:49:42.218203Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became candidate at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218304Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 received MsgVoteResp from b38c55c42a3b698 at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218487Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"b38c55c42a3b698 became leader at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.218517Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 3"}
	{"level":"info","ts":"2024-07-17T17:49:42.221374Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:49:42.221719Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:49:42.224325Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:49:42.224772Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:49:42.240735Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:49:42.240792Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:49:42.251537Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	
	
	==> etcd [5f332be219358a1962906c8879dc8340cacfe7b8d5b0e42191706a9d9285ef46] <==
	{"level":"info","ts":"2024-07-17T17:26:10.796478Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: b38c55c42a3b698 elected leader b38c55c42a3b698 at term 2"}
	{"level":"info","ts":"2024-07-17T17:26:10.801067Z","caller":"etcdserver/server.go:2068","msg":"published local member to cluster through raft","local-member-id":"b38c55c42a3b698","local-member-attributes":"{Name:ha-333994 ClientURLs:[https://192.168.39.180:2379]}","request-path":"/0/members/b38c55c42a3b698/attributes","cluster-id":"5a7d3c553a64e690","publish-timeout":"7s"}
	{"level":"info","ts":"2024-07-17T17:26:10.801194Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.801316Z","caller":"etcdserver/server.go:2578","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.806906Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.807031Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-07-17T17:26:10.812458Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.180:2379"}
	{"level":"info","ts":"2024-07-17T17:26:10.801338Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-07-17T17:26:10.817184Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"5a7d3c553a64e690","local-member-id":"b38c55c42a3b698","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817367Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.817882Z","caller":"etcdserver/server.go:2602","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2024-07-17T17:26:10.819447Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2024-07-17T17:36:11.068267Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":967}
	{"level":"info","ts":"2024-07-17T17:36:11.079164Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":967,"took":"10.209299ms","hash":2954245254,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2387968,"current-db-size-in-use":"2.4 MB"}
	{"level":"info","ts":"2024-07-17T17:36:11.079278Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":2954245254,"revision":967,"compact-revision":-1}
	{"level":"info","ts":"2024-07-17T17:39:18.346467Z","caller":"traceutil/trace.go:171","msg":"trace[2056250208] linearizableReadLoop","detail":"{readStateIndex:2015; appliedIndex:2014; }","duration":"126.865425ms","start":"2024-07-17T17:39:18.21956Z","end":"2024-07-17T17:39:18.346426Z","steps":["trace[2056250208] 'read index received'  (duration: 119.405157ms)","trace[2056250208] 'applied index is now lower than readState.Index'  (duration: 7.459705ms)"],"step_count":2}
	{"level":"warn","ts":"2024-07-17T17:39:18.346762Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"127.086437ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csidrivers/\" range_end:\"/registry/csidrivers0\" count_only:true ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-07-17T17:39:18.346812Z","caller":"traceutil/trace.go:171","msg":"trace[1825061226] range","detail":"{range_begin:/registry/csidrivers/; range_end:/registry/csidrivers0; response_count:0; response_revision:1845; }","duration":"127.262091ms","start":"2024-07-17T17:39:18.219537Z","end":"2024-07-17T17:39:18.346799Z","steps":["trace[1825061226] 'agreement among raft nodes before linearized reading'  (duration: 127.036161ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:39:18.347026Z","caller":"traceutil/trace.go:171","msg":"trace[2022994700] transaction","detail":"{read_only:false; response_revision:1845; number_of_response:1; }","duration":"148.410957ms","start":"2024-07-17T17:39:18.198608Z","end":"2024-07-17T17:39:18.347019Z","steps":["trace[2022994700] 'process raft request'  (duration: 140.398667ms)"],"step_count":1}
	{"level":"info","ts":"2024-07-17T17:41:11.077099Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":1506}
	{"level":"info","ts":"2024-07-17T17:41:11.08271Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":1506,"took":"4.803656ms","hash":4135639207,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2002944,"current-db-size-in-use":"2.0 MB"}
	{"level":"info","ts":"2024-07-17T17:41:11.082934Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":4135639207,"revision":1506,"compact-revision":967}
	{"level":"info","ts":"2024-07-17T17:46:11.088545Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":2115}
	{"level":"info","ts":"2024-07-17T17:46:11.093763Z","caller":"mvcc/kvstore_compaction.go:68","msg":"finished scheduled compaction","compact-revision":2115,"took":"4.690419ms","hash":3040853481,"current-db-size-bytes":2387968,"current-db-size":"2.4 MB","current-db-size-in-use-bytes":2105344,"current-db-size-in-use":"2.1 MB"}
	{"level":"info","ts":"2024-07-17T17:46:11.093935Z","caller":"mvcc/hash.go:137","msg":"storing new hash","hash":3040853481,"revision":2115,"compact-revision":1506}
	
	
	==> kernel <==
	 17:54:11 up 4 min,  0 users,  load average: 0.21, 0.15, 0.07
	Linux ha-333994 5.10.207 #1 SMP Tue Jul 16 20:46:02 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [dd5e8f56c4264ac3ce97606579dbb45bd1defa712cc5dfd7ef8601f279e53896] <==
	I0717 17:53:11.814652       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:11.814805       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:11.814813       1 main.go:303] handling current node
	I0717 17:53:21.810119       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:21.810198       1 main.go:303] handling current node
	I0717 17:53:21.810249       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:21.810274       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:31.817062       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:31.817224       1 main.go:303] handling current node
	I0717 17:53:31.817323       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:31.817345       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:41.816987       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:41.817045       1 main.go:303] handling current node
	I0717 17:53:41.817065       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:41.817072       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:51.809287       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:53:51.809360       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:53:51.810037       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:53:51.810079       1 main.go:303] handling current node
	I0717 17:54:01.809782       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:54:01.809814       1 main.go:303] handling current node
	I0717 17:54:01.809828       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:54:01.809833       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:54:11.813973       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:54:11.814016       1 main.go:303] handling current node
	
	
	==> kindnet [f1b88563e61d620b61da7e9c081cadd03d26d579ae84f2cad14d040ee1854428] <==
	I0717 17:46:36.593294       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:46:46.594446       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:46:46.594495       1 main.go:303] handling current node
	I0717 17:46:46.594508       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:46:46.594516       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:46:56.593210       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:46:56.593351       1 main.go:303] handling current node
	I0717 17:46:56.593473       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:46:56.593496       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:06.593427       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:06.593567       1 main.go:303] handling current node
	I0717 17:47:06.593587       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:06.593593       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:16.603181       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:16.603262       1 main.go:303] handling current node
	I0717 17:47:16.603286       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:16.603292       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:26.593294       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:26.593479       1 main.go:303] handling current node
	I0717 17:47:26.593751       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:26.593932       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	I0717 17:47:36.593175       1 main.go:299] Handling node with IPs: map[192.168.39.180:{}]
	I0717 17:47:36.593213       1 main.go:303] handling current node
	I0717 17:47:36.593235       1 main.go:299] Handling node with IPs: map[192.168.39.197:{}]
	I0717 17:47:36.593240       1 main.go:326] Node ha-333994-m03 has CIDR [10.244.1.0/24] 
	
	
	==> kube-apiserver [3c3e7888bdfe65eb452a8b1911680c8ed68a5d49a41528c6544c9bdbad54463d] <==
	I0717 17:49:43.595082       1 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
	I0717 17:49:43.595111       1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
	I0717 17:49:43.595140       1 crd_finalizer.go:266] Starting CRDFinalizer
	I0717 17:49:43.597000       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0717 17:49:43.597114       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0717 17:49:43.641418       1 shared_informer.go:320] Caches are synced for node_authorizer
	I0717 17:49:43.648238       1 shared_informer.go:320] Caches are synced for *generic.policySource[*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicy,*k8s.io/api/admissionregistration/v1.ValidatingAdmissionPolicyBinding,k8s.io/apiserver/pkg/admission/plugin/policy/validating.Validator]
	I0717 17:49:43.648665       1 policy_source.go:224] refreshing policies
	I0717 17:49:43.659841       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:49:43.676754       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0717 17:49:43.677085       1 shared_informer.go:320] Caches are synced for configmaps
	I0717 17:49:43.679683       1 apf_controller.go:379] Running API Priority and Fairness config worker
	I0717 17:49:43.679810       1 apf_controller.go:382] Running API Priority and Fairness periodic rebalancing process
	I0717 17:49:43.682669       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0717 17:49:43.686464       1 shared_informer.go:320] Caches are synced for cluster_authentication_trust_controller
	I0717 17:49:43.688086       1 handler_discovery.go:447] Starting ResourceDiscoveryManager
	E0717 17:49:43.689041       1 controller.go:97] Error removing old endpoints from kubernetes service: no API server IP addresses were listed in storage, refusing to erase all endpoints for the kubernetes Service
	I0717 17:49:43.691390       1 shared_informer.go:320] Caches are synced for crd-autoregister
	I0717 17:49:43.692086       1 aggregator.go:165] initial CRD sync complete...
	I0717 17:49:43.692210       1 autoregister_controller.go:141] Starting autoregister controller
	I0717 17:49:43.692231       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0717 17:49:43.692323       1 cache.go:39] Caches are synced for autoregister controller
	I0717 17:49:44.589738       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:49:55.907406       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:49:56.140322       1 controller.go:615] quota admission added evaluator for: endpoints
	
	
	==> kube-apiserver [d3a0374a88e2c013e134eec1052b56a531aae862faa0eb5bb6e6411c1d40d411] <==
	E0717 17:26:12.663111       1 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-system\" not found" interval="200ms"
	E0717 17:26:12.683423       1 controller.go:145] while syncing ConfigMap "kube-system/kube-apiserver-legacy-service-account-token-tracking", err: namespaces "kube-system" not found
	I0717 17:26:12.731655       1 controller.go:615] quota admission added evaluator for: namespaces
	I0717 17:26:12.867696       1 controller.go:615] quota admission added evaluator for: leases.coordination.k8s.io
	I0717 17:26:13.519087       1 storage_scheduling.go:95] created PriorityClass system-node-critical with value 2000001000
	I0717 17:26:13.524933       1 storage_scheduling.go:95] created PriorityClass system-cluster-critical with value 2000000000
	I0717 17:26:13.525042       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0717 17:26:14.141166       1 controller.go:615] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0717 17:26:14.190199       1 controller.go:615] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0717 17:26:14.346951       1 alloc.go:330] "allocated clusterIPs" service="default/kubernetes" clusterIPs={"IPv4":"10.96.0.1"}
	W0717 17:26:14.355637       1 lease.go:265] Resetting endpoints for master service "kubernetes" to [192.168.39.180]
	I0717 17:26:14.357063       1 controller.go:615] quota admission added evaluator for: endpoints
	I0717 17:26:14.363079       1 controller.go:615] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0717 17:26:14.550932       1 controller.go:615] quota admission added evaluator for: serviceaccounts
	I0717 17:26:16.299323       1 controller.go:615] quota admission added evaluator for: deployments.apps
	I0717 17:26:16.313650       1 alloc.go:330] "allocated clusterIPs" service="kube-system/kube-dns" clusterIPs={"IPv4":"10.96.0.10"}
	I0717 17:26:16.444752       1 controller.go:615] quota admission added evaluator for: daemonsets.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.574426       1 controller.go:615] quota admission added evaluator for: controllerrevisions.apps
	I0717 17:26:29.724582       1 controller.go:615] quota admission added evaluator for: replicasets.apps
	E0717 17:38:36.696311       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53884: use of closed network connection
	E0717 17:38:37.099896       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:53968: use of closed network connection
	E0717 17:38:37.471315       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:54040: use of closed network connection
	E0717 17:38:39.884607       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45970: use of closed network connection
	E0717 17:38:40.043702       1 conn.go:339] Error on socket receive: read tcp 192.168.39.254:8443->192.168.39.1:45990: use of closed network connection
	
	
	==> kube-controller-manager [38a3e6e69ce36e4718f7597a891505e74d497b2ce82217fdebe3363666ea32f6] <==
	I0717 17:49:55.953819       1 shared_informer.go:320] Caches are synced for stateful set
	I0717 17:49:55.969497       1 shared_informer.go:320] Caches are synced for disruption
	I0717 17:49:55.969720       1 shared_informer.go:320] Caches are synced for daemon sets
	I0717 17:49:55.989955       1 shared_informer.go:320] Caches are synced for crt configmap
	I0717 17:49:55.995325       1 shared_informer.go:320] Caches are synced for taint
	I0717 17:49:55.995861       1 node_lifecycle_controller.go:1227] "Initializing eviction metric for zone" logger="node-lifecycle-controller" zone=""
	I0717 17:49:56.008684       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994"
	I0717 17:49:56.009020       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:49:56.009215       1 node_lifecycle_controller.go:1073] "Controller detected that zone is now in new state" logger="node-lifecycle-controller" zone="" newState="Normal"
	I0717 17:49:56.107028       1 shared_informer.go:320] Caches are synced for endpoint_slice_mirroring
	I0717 17:49:56.125129       1 shared_informer.go:320] Caches are synced for HPA
	I0717 17:49:56.130984       1 shared_informer.go:320] Caches are synced for endpoint
	I0717 17:49:56.150989       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:49:56.160240       1 shared_informer.go:320] Caches are synced for resource quota
	I0717 17:49:56.545417       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:49:56.545744       1 garbagecollector.go:157] "All resource monitors have synced. Proceeding to collect garbage" logger="garbage-collector-controller"
	I0717 17:49:56.607585       1 shared_informer.go:320] Caches are synced for garbage collector
	I0717 17:50:29.652302       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="17.989423ms"
	I0717 17:50:29.652927       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="154.343µs"
	I0717 17:50:29.673006       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="10.432657ms"
	I0717 17:50:29.674427       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="35.074µs"
	I0717 17:54:00.096330       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="23.157048ms"
	I0717 17:54:00.103117       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.620259ms"
	I0717 17:54:00.103395       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="156.252µs"
	I0717 17:54:00.105615       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="47.541µs"
	
	
	==> kube-controller-manager [515c5ff9f46dae1a0befd8efb5eb62b1d7d5a8d9ab3d2489e5d77225c2969697] <==
	I0717 17:26:46.721053       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="84.491µs"
	I0717 17:26:47.592898       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="95.998µs"
	I0717 17:26:47.650175       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="48.942µs"
	I0717 17:26:48.607906       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.62659ms"
	I0717 17:26:48.608008       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="53.426µs"
	I0717 17:26:48.647797       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="14.456738ms"
	I0717 17:26:48.648394       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="kube-system/coredns-7db6d8ff4d" duration="67.436µs"
	I0717 17:26:49.026935       1 node_lifecycle_controller.go:1050] "Controller detected that some Nodes are Ready. Exiting master disruption mode" logger="node-lifecycle-controller"
	I0717 17:27:16.243497       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="56.504603ms"
	I0717 17:27:16.262527       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="18.940756ms"
	I0717 17:27:16.263000       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="73.787µs"
	I0717 17:27:16.274690       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="36.512µs"
	I0717 17:27:19.665105       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="6.033144ms"
	I0717 17:27:19.665529       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="92.848µs"
	I0717 17:40:15.410809       1 actual_state_of_world.go:543] "Failed to update statusUpdateNeeded field in actual state of world" logger="persistentvolume-attach-detach-controller" err="Failed to set statusUpdateNeeded to needed true, because nodeName=\"ha-333994-m03\" does not exist"
	I0717 17:40:15.440785       1 range_allocator.go:381] "Set node PodCIDR" logger="node-ipam-controller" node="ha-333994-m03" podCIDRs=["10.244.1.0/24"]
	I0717 17:40:19.153891       1 node_lifecycle_controller.go:879] "Missing timestamp for Node. Assuming now as a timestamp" logger="node-lifecycle-controller" node="ha-333994-m03"
	I0717 17:40:34.584196       1 topologycache.go:237] "Can't get CPU or zone information for node" logger="endpointslice-controller" node="ha-333994-m03"
	I0717 17:40:34.610758       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="117.829µs"
	I0717 17:40:34.611099       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="136.33µs"
	I0717 17:40:34.627517       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="41.973µs"
	I0717 17:40:38.439768       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="13.993456ms"
	I0717 17:40:38.440397       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="128.876µs"
	I0717 17:46:44.300951       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="17.533645ms"
	I0717 17:46:44.302036       1 replica_set.go:676] "Finished syncing" logger="replicaset-controller" kind="ReplicaSet" key="default/busybox-fc5497c4f" duration="47.71µs"
	
	
	==> kube-proxy [0a2a73f6200a3c41f2559944af1b8896b01ccd3f6fa5ac3a4d66a7ec20085f45] <==
	I0717 17:26:30.633390       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:26:30.664296       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:26:30.777855       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:26:30.777915       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:26:30.777933       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:26:30.782913       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:26:30.783727       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:26:30.783743       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:26:30.785883       1 config.go:192] "Starting service config controller"
	I0717 17:26:30.786104       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:26:30.786184       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:26:30.786194       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:26:30.786196       1 config.go:319] "Starting node config controller"
	I0717 17:26:30.786202       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:26:30.886459       1 shared_informer.go:320] Caches are synced for endpoint slice config
	I0717 17:26:30.886517       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:26:30.886527       1 shared_informer.go:320] Caches are synced for service config
	
	
	==> kube-proxy [cede48d48fe274c1e899c0bd8bea598571a7def0a52e5e2bade595ef4f553fef] <==
	I0717 17:49:50.697431       1 server_linux.go:69] "Using iptables proxy"
	I0717 17:49:50.728033       1 server.go:1062] "Successfully retrieved node IP(s)" IPs=["192.168.39.180"]
	I0717 17:49:50.773252       1 server_linux.go:143] "No iptables support for family" ipFamily="IPv6"
	I0717 17:49:50.773306       1 server.go:661] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0717 17:49:50.773323       1 server_linux.go:165] "Using iptables Proxier"
	I0717 17:49:50.776016       1 proxier.go:243] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0717 17:49:50.776460       1 server.go:872] "Version info" version="v1.30.2"
	I0717 17:49:50.776490       1 server.go:874] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:49:50.778529       1 config.go:192] "Starting service config controller"
	I0717 17:49:50.778847       1 shared_informer.go:313] Waiting for caches to sync for service config
	I0717 17:49:50.778963       1 config.go:101] "Starting endpoint slice config controller"
	I0717 17:49:50.779098       1 shared_informer.go:313] Waiting for caches to sync for endpoint slice config
	I0717 17:49:50.780341       1 config.go:319] "Starting node config controller"
	I0717 17:49:50.780372       1 shared_informer.go:313] Waiting for caches to sync for node config
	I0717 17:49:50.880389       1 shared_informer.go:320] Caches are synced for service config
	I0717 17:49:50.880465       1 shared_informer.go:320] Caches are synced for node config
	I0717 17:49:50.880915       1 shared_informer.go:320] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [2f62c96e1a7844ed21d49b39ee23ef0aefd932e9d5a3ac7a78f787779864806c] <==
	E0717 17:26:12.612716       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:12.612322       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0717 17:26:12.612328       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612334       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612341       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:12.612951       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0717 17:26:13.435639       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0717 17:26:13.435693       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0717 17:26:13.453973       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0717 17:26:13.454017       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0717 17:26:13.542464       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.542509       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.613338       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0717 17:26:13.613487       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0717 17:26:13.619979       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.620074       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0717 17:26:13.625523       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0717 17:26:13.625659       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0717 17:26:13.773180       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0717 17:26:13.773245       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0717 17:26:13.789228       1 reflector.go:547] runtime/asm_amd64.s:1695: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0717 17:26:13.789279       1 reflector.go:150] runtime/asm_amd64.s:1695: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0717 17:26:13.882287       1 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0717 17:26:13.882339       1 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	I0717 17:26:16.586108       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kube-scheduler [7f7ede089f3e73228764b3c542d044e8dfb371908879f2d014d0b3cb56b61a60] <==
	I0717 17:49:41.818392       1 serving.go:380] Generated self-signed cert in-memory
	I0717 17:49:43.698181       1 server.go:154] "Starting Kubernetes Scheduler" version="v1.30.2"
	I0717 17:49:43.698222       1 server.go:156] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0717 17:49:43.704731       1 secure_serving.go:213] Serving securely on 127.0.0.1:10259
	I0717 17:49:43.704960       1 requestheader_controller.go:169] Starting RequestHeaderAuthRequestController
	I0717 17:49:43.705003       1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController
	I0717 17:49:43.705055       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0717 17:49:43.708667       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0717 17:49:43.708702       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0717 17:49:43.708715       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file"
	I0717 17:49:43.708721       1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
	I0717 17:49:43.805438       1 shared_informer.go:320] Caches are synced for RequestHeaderAuthRequestController
	I0717 17:49:43.809697       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
	I0717 17:49:43.809823       1 shared_informer.go:320] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Jul 17 17:50:20 ha-333994 kubelet[912]: I0717 17:50:20.667533     912 scope.go:117] "RemoveContainer" containerID="86b483ab22e1a88b745f12d55b1fa66f91f47882547e5407707e50180e29df21"
	Jul 17 17:50:20 ha-333994 kubelet[912]: I0717 17:50:20.668345     912 scope.go:117] "RemoveContainer" containerID="603ad8840c52684184d18957755dbefa293c0f1b45c847cd88296b580d9ac18f"
	Jul 17 17:50:20 ha-333994 kubelet[912]: E0717 17:50:20.668770     912 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-provisioner\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-provisioner pod=storage-provisioner_kube-system(123c311b-67ed-42b2-ad53-cc59077dfbe7)\"" pod="kube-system/storage-provisioner" podUID="123c311b-67ed-42b2-ad53-cc59077dfbe7"
	Jul 17 17:50:33 ha-333994 kubelet[912]: I0717 17:50:33.312537     912 scope.go:117] "RemoveContainer" containerID="2030e6caab488650f28c0420e472e5dc02b9197bfb6300d22856d4ccb76ed29d"
	Jul 17 17:50:33 ha-333994 kubelet[912]: E0717 17:50:33.409447     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:50:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:50:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:50:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:50:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:50:36 ha-333994 kubelet[912]: I0717 17:50:36.384656     912 scope.go:117] "RemoveContainer" containerID="603ad8840c52684184d18957755dbefa293c0f1b45c847cd88296b580d9ac18f"
	Jul 17 17:51:33 ha-333994 kubelet[912]: E0717 17:51:33.410923     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:51:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:51:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:51:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:51:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:52:33 ha-333994 kubelet[912]: E0717 17:52:33.411201     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:52:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:52:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:52:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:52:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Jul 17 17:53:33 ha-333994 kubelet[912]: E0717 17:53:33.409498     912 iptables.go:577] "Could not set up iptables canary" err=<
	Jul 17 17:53:33 ha-333994 kubelet[912]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Jul 17 17:53:33 ha-333994 kubelet[912]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Jul 17 17:53:33 ha-333994 kubelet[912]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Jul 17 17:53:33 ha-333994 kubelet[912]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-333994 -n ha-333994
helpers_test.go:261: (dbg) Run:  kubectl --context ha-333994 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-fc5497c4f-djvz6 busybox-fc5497c4f-gtghn
helpers_test.go:274: ======> post-mortem[TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6 busybox-fc5497c4f-gtghn
helpers_test.go:282: (dbg) kubectl --context ha-333994 describe pod busybox-fc5497c4f-djvz6 busybox-fc5497c4f-gtghn:

                                                
                                                
-- stdout --
	Name:             busybox-fc5497c4f-djvz6
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-59849 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-59849:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age                  From               Message
	  ----     ------            ----                 ----               -------
	  Warning  FailedScheduling  4m29s                default-scheduler  0/2 nodes are available: 1 node(s) didn't match pod anti-affinity rules, 1 node(s) had untolerated taint {node.kubernetes.io/unreachable: }. preemption: 0/2 nodes are available: 1 No preemption victims found for incoming pod, 1 Preemption is not helpful for scheduling.
	  Warning  FailedScheduling  4m23s                default-scheduler  0/2 nodes are available: 1 node(s) didn't match pod anti-affinity rules, 1 node(s) had untolerated taint {node.kubernetes.io/unreachable: }. preemption: 0/2 nodes are available: 1 No preemption victims found for incoming pod, 1 Preemption is not helpful for scheduling.
	  Warning  FailedScheduling  16m (x3 over 26m)    default-scheduler  0/1 nodes are available: 1 node(s) didn't match pod anti-affinity rules. preemption: 0/1 nodes are available: 1 No preemption victims found for incoming pod.
	  Warning  FailedScheduling  8m26s (x3 over 13m)  default-scheduler  0/2 nodes are available: 2 node(s) didn't match pod anti-affinity rules. preemption: 0/2 nodes are available: 2 No preemption victims found for incoming pod.
	
	
	Name:             busybox-fc5497c4f-gtghn
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=fc5497c4f
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-fc5497c4f
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-lfmtp (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-lfmtp:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age   From               Message
	  ----     ------            ----  ----               -------
	  Warning  FailedScheduling  12s   default-scheduler  0/2 nodes are available: 1 node(s) didn't match pod anti-affinity rules, 1 node(s) were unschedulable. preemption: 0/2 nodes are available: 1 No preemption victims found for incoming pod, 1 Preemption is not helpful for scheduling.

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMultiControlPlane/serial/DegradedAfterSecondaryNodeDelete (2.91s)

                                                
                                    
TestMultiControlPlane/serial/StopCluster (102.77s)

                                                
                                                
=== RUN   TestMultiControlPlane/serial/StopCluster
ha_test.go:531: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 stop -v=7 --alsologtostderr
E0717 17:54:41.797418   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
ha_test.go:531: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 stop -v=7 --alsologtostderr: signal: killed (1m24.34433418s)

                                                
                                                
-- stdout --
	* Stopping node "ha-333994-m02"  ...
	* Stopping node "ha-333994"  ...

                                                
                                                
-- /stdout --
** stderr ** 
	I0717 17:54:13.032871   41488 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:54:13.033009   41488 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:54:13.033018   41488 out.go:304] Setting ErrFile to fd 2...
	I0717 17:54:13.033025   41488 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:54:13.033223   41488 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:54:13.033448   41488 out.go:298] Setting JSON to false
	I0717 17:54:13.033541   41488 mustload.go:65] Loading cluster: ha-333994
	I0717 17:54:13.033895   41488 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:54:13.033998   41488 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/ha-333994/config.json ...
	I0717 17:54:13.034214   41488 mustload.go:65] Loading cluster: ha-333994
	I0717 17:54:13.034367   41488 config.go:182] Loaded profile config "ha-333994": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:54:13.034396   41488 stop.go:39] StopHost: ha-333994-m02
	I0717 17:54:13.034764   41488 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:54:13.034816   41488 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:54:13.049510   41488 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37663
	I0717 17:54:13.049894   41488 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:54:13.050423   41488 main.go:141] libmachine: Using API Version  1
	I0717 17:54:13.050442   41488 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:54:13.050832   41488 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:54:13.053246   41488 out.go:177] * Stopping node "ha-333994-m02"  ...
	I0717 17:54:13.054900   41488 machine.go:157] backing up vm config to /var/lib/minikube/backup: [/etc/cni /etc/kubernetes]
	I0717 17:54:13.054926   41488 main.go:141] libmachine: (ha-333994-m02) Calling .DriverName
	I0717 17:54:13.055159   41488 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/backup
	I0717 17:54:13.055186   41488 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHHostname
	I0717 17:54:13.057928   41488 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:54:13.058371   41488 main.go:141] libmachine: (ha-333994-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:b1:0f:81", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:44 +0000 UTC Type:0 Mac:52:54:00:b1:0f:81 Iaid: IPaddr:192.168.39.127 Prefix:24 Hostname:ha-333994-m02 Clientid:01:52:54:00:b1:0f:81}
	I0717 17:54:13.058401   41488 main.go:141] libmachine: (ha-333994-m02) DBG | domain ha-333994-m02 has defined IP address 192.168.39.127 and MAC address 52:54:00:b1:0f:81 in network mk-ha-333994
	I0717 17:54:13.058539   41488 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHPort
	I0717 17:54:13.058724   41488 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHKeyPath
	I0717 17:54:13.058867   41488 main.go:141] libmachine: (ha-333994-m02) Calling .GetSSHUsername
	I0717 17:54:13.058978   41488 sshutil.go:53] new ssh client: &{IP:192.168.39.127 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994-m02/id_rsa Username:docker}
	I0717 17:54:13.147208   41488 ssh_runner.go:195] Run: sudo rsync --archive --relative /etc/cni /var/lib/minikube/backup
	I0717 17:54:13.199909   41488 ssh_runner.go:195] Run: sudo rsync --archive --relative /etc/kubernetes /var/lib/minikube/backup
	I0717 17:54:13.256547   41488 main.go:141] libmachine: Stopping "ha-333994-m02"...
	I0717 17:54:13.256591   41488 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:54:13.258037   41488 main.go:141] libmachine: (ha-333994-m02) Calling .Stop
	I0717 17:54:13.261429   41488 main.go:141] libmachine: (ha-333994-m02) Waiting for machine to stop 0/120
	I0717 17:54:14.262958   41488 main.go:141] libmachine: (ha-333994-m02) Calling .GetState
	I0717 17:54:14.264147   41488 main.go:141] libmachine: Machine "ha-333994-m02" was stopped.
	I0717 17:54:14.264164   41488 stop.go:75] duration metric: took 1.209264042s to stop
	I0717 17:54:14.264187   41488 stop.go:39] StopHost: ha-333994
	I0717 17:54:14.264503   41488 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:54:14.264557   41488 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:54:14.279030   41488 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35843
	I0717 17:54:14.279438   41488 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:54:14.279941   41488 main.go:141] libmachine: Using API Version  1
	I0717 17:54:14.279965   41488 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:54:14.280234   41488 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:54:14.282422   41488 out.go:177] * Stopping node "ha-333994"  ...
	I0717 17:54:14.283699   41488 machine.go:157] backing up vm config to /var/lib/minikube/backup: [/etc/cni /etc/kubernetes]
	I0717 17:54:14.283718   41488 main.go:141] libmachine: (ha-333994) Calling .DriverName
	I0717 17:54:14.283941   41488 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/backup
	I0717 17:54:14.283967   41488 main.go:141] libmachine: (ha-333994) Calling .GetSSHHostname
	I0717 17:54:14.286500   41488 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:54:14.286884   41488 main.go:141] libmachine: (ha-333994) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:73:4b:68", ip: ""} in network mk-ha-333994: {Iface:virbr1 ExpiryTime:2024-07-17 18:49:21 +0000 UTC Type:0 Mac:52:54:00:73:4b:68 Iaid: IPaddr:192.168.39.180 Prefix:24 Hostname:ha-333994 Clientid:01:52:54:00:73:4b:68}
	I0717 17:54:14.286921   41488 main.go:141] libmachine: (ha-333994) DBG | domain ha-333994 has defined IP address 192.168.39.180 and MAC address 52:54:00:73:4b:68 in network mk-ha-333994
	I0717 17:54:14.287004   41488 main.go:141] libmachine: (ha-333994) Calling .GetSSHPort
	I0717 17:54:14.287242   41488 main.go:141] libmachine: (ha-333994) Calling .GetSSHKeyPath
	I0717 17:54:14.287405   41488 main.go:141] libmachine: (ha-333994) Calling .GetSSHUsername
	I0717 17:54:14.287573   41488 sshutil.go:53] new ssh client: &{IP:192.168.39.180 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/ha-333994/id_rsa Username:docker}
	I0717 17:54:14.372493   41488 ssh_runner.go:195] Run: sudo rsync --archive --relative /etc/cni /var/lib/minikube/backup
	I0717 17:54:14.425565   41488 ssh_runner.go:195] Run: sudo rsync --archive --relative /etc/kubernetes /var/lib/minikube/backup
	I0717 17:54:14.478589   41488 main.go:141] libmachine: Stopping "ha-333994"...
	I0717 17:54:14.478610   41488 main.go:141] libmachine: (ha-333994) Calling .GetState
	I0717 17:54:14.480107   41488 main.go:141] libmachine: (ha-333994) Calling .Stop
	I0717 17:54:14.483331   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 0/120
	I0717 17:54:15.484719   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 1/120
	I0717 17:54:16.485972   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 2/120
	I0717 17:54:17.487143   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 3/120
	I0717 17:54:18.488396   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 4/120
	I0717 17:54:19.490359   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 5/120
	I0717 17:54:20.492394   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 6/120
	I0717 17:54:21.493819   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 7/120
	I0717 17:54:22.495243   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 8/120
	I0717 17:54:23.496753   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 9/120
	I0717 17:54:24.498820   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 10/120
	I0717 17:54:25.500076   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 11/120
	I0717 17:54:26.501445   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 12/120
	I0717 17:54:27.502931   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 13/120
	I0717 17:54:28.504213   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 14/120
	I0717 17:54:29.505766   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 15/120
	I0717 17:54:30.507477   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 16/120
	I0717 17:54:31.508970   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 17/120
	I0717 17:54:32.510569   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 18/120
	I0717 17:54:33.512042   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 19/120
	I0717 17:54:34.513894   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 20/120
	I0717 17:54:35.515436   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 21/120
	I0717 17:54:36.516724   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 22/120
	I0717 17:54:37.518029   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 23/120
	I0717 17:54:38.519522   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 24/120
	I0717 17:54:39.521181   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 25/120
	I0717 17:54:40.522405   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 26/120
	I0717 17:54:41.523672   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 27/120
	I0717 17:54:42.524968   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 28/120
	I0717 17:54:43.526310   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 29/120
	I0717 17:54:44.528315   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 30/120
	I0717 17:54:45.529621   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 31/120
	I0717 17:54:46.530932   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 32/120
	I0717 17:54:47.532359   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 33/120
	I0717 17:54:48.533647   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 34/120
	I0717 17:54:49.535513   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 35/120
	I0717 17:54:50.536788   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 36/120
	I0717 17:54:51.538108   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 37/120
	I0717 17:54:52.539470   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 38/120
	I0717 17:54:53.540780   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 39/120
	I0717 17:54:54.542625   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 40/120
	I0717 17:54:55.543883   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 41/120
	I0717 17:54:56.545217   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 42/120
	I0717 17:54:57.546545   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 43/120
	I0717 17:54:58.548034   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 44/120
	I0717 17:54:59.549872   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 45/120
	I0717 17:55:00.551209   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 46/120
	I0717 17:55:01.552575   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 47/120
	I0717 17:55:02.553942   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 48/120
	I0717 17:55:03.555344   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 49/120
	I0717 17:55:04.557295   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 50/120
	I0717 17:55:05.558558   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 51/120
	I0717 17:55:06.560493   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 52/120
	I0717 17:55:07.561787   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 53/120
	I0717 17:55:08.563112   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 54/120
	I0717 17:55:09.564833   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 55/120
	I0717 17:55:10.566075   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 56/120
	I0717 17:55:11.567431   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 57/120
	I0717 17:55:12.568683   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 58/120
	I0717 17:55:13.570045   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 59/120
	I0717 17:55:14.571765   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 60/120
	I0717 17:55:15.572997   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 61/120
	I0717 17:55:16.574356   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 62/120
	I0717 17:55:17.575810   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 63/120
	I0717 17:55:18.577270   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 64/120
	I0717 17:55:19.578925   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 65/120
	I0717 17:55:20.580196   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 66/120
	I0717 17:55:21.581497   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 67/120
	I0717 17:55:22.582906   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 68/120
	I0717 17:55:23.584241   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 69/120
	I0717 17:55:24.586110   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 70/120
	I0717 17:55:25.587676   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 71/120
	I0717 17:55:26.589025   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 72/120
	I0717 17:55:27.590525   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 73/120
	I0717 17:55:28.591892   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 74/120
	I0717 17:55:29.593598   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 75/120
	I0717 17:55:30.595012   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 76/120
	I0717 17:55:31.596376   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 77/120
	I0717 17:55:32.597890   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 78/120
	I0717 17:55:33.599173   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 79/120
	I0717 17:55:34.600932   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 80/120
	I0717 17:55:35.602445   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 81/120
	I0717 17:55:36.604028   41488 main.go:141] libmachine: (ha-333994) Waiting for machine to stop 82/120

** /stderr **
ha_test.go:533: failed to stop cluster. args "out/minikube-linux-amd64 -p ha-333994 stop -v=7 --alsologtostderr": signal: killed
ha_test.go:537: (dbg) Run:  out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr
ha_test.go:537: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr: context deadline exceeded (1.575µs)
ha_test.go:540: failed to run minikube status. args "out/minikube-linux-amd64 -p ha-333994 status -v=7 --alsologtostderr" : context deadline exceeded
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994
E0717 17:55:55.189150   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p ha-333994 -n ha-333994: exit status 3 (18.42085029s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0717 17:55:55.758490   41809 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.168.39.180:22: connect: no route to host
	E0717 17:55:55.758507   41809 status.go:249] status error: NewSession: new client: new client: dial tcp 192.168.39.180:22: connect: no route to host

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "ha-333994" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestMultiControlPlane/serial/StopCluster (102.77s)

TestNoKubernetes/serial/ProfileList (125.83s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-linux-amd64 profile list
no_kubernetes_test.go:169: (dbg) Non-zero exit: out/minikube-linux-amd64 profile list: signal: killed (2m5.602256673s)
no_kubernetes_test.go:171: Profile list failed : "out/minikube-linux-amd64 profile list" : signal: killed
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p NoKubernetes-971212 -n NoKubernetes-971212
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p NoKubernetes-971212 -n NoKubernetes-971212: exit status 6 (223.293744ms)

-- stdout --
	Running
	WARNING: Your kubectl is pointing to stale minikube-vm.
	To fix the kubectl context, run `minikube update-context`

-- /stdout --
** stderr ** 
	E0717 18:27:32.129585   61402 status.go:417] kubeconfig endpoint: get endpoint: "NoKubernetes-971212" does not appear in /home/jenkins/minikube-integration/19283-14409/kubeconfig

** /stderr **
helpers_test.go:239: status error: exit status 6 (may be ok)
helpers_test.go:241: "NoKubernetes-971212" host is not running, skipping log retrieval (state="Running\nWARNING: Your kubectl is pointing to stale minikube-vm.\nTo fix the kubectl context, run `minikube update-context`")
--- FAIL: TestNoKubernetes/serial/ProfileList (125.83s)


Test pass (273/327)

Order | Passed test | Duration (s)
3 TestDownloadOnly/v1.20.0/json-events 45.22
4 TestDownloadOnly/v1.20.0/preload-exists 0
8 TestDownloadOnly/v1.20.0/LogsDuration 0.06
9 TestDownloadOnly/v1.20.0/DeleteAll 0.13
10 TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds 0.12
12 TestDownloadOnly/v1.30.2/json-events 24.98
13 TestDownloadOnly/v1.30.2/preload-exists 0
17 TestDownloadOnly/v1.30.2/LogsDuration 0.06
18 TestDownloadOnly/v1.30.2/DeleteAll 0.13
19 TestDownloadOnly/v1.30.2/DeleteAlwaysSucceeds 0.12
21 TestDownloadOnly/v1.31.0-beta.0/json-events 28.44
22 TestDownloadOnly/v1.31.0-beta.0/preload-exists 0
26 TestDownloadOnly/v1.31.0-beta.0/LogsDuration 0.06
27 TestDownloadOnly/v1.31.0-beta.0/DeleteAll 0.13
28 TestDownloadOnly/v1.31.0-beta.0/DeleteAlwaysSucceeds 0.12
30 TestBinaryMirror 0.55
31 TestOffline 86.97
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.05
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.05
36 TestAddons/Setup 271.13
38 TestAddons/parallel/Registry 19.86
39 TestAddons/parallel/Ingress 21.2
40 TestAddons/parallel/InspektorGadget 10.86
41 TestAddons/parallel/MetricsServer 5.8
42 TestAddons/parallel/HelmTiller 16.67
44 TestAddons/parallel/CSI 53.36
45 TestAddons/parallel/Headlamp 16.99
46 TestAddons/parallel/CloudSpanner 5.54
47 TestAddons/parallel/LocalPath 17.13
48 TestAddons/parallel/NvidiaDevicePlugin 6.68
49 TestAddons/parallel/Yakd 6.01
50 TestAddons/parallel/Volcano 38.7
53 TestAddons/serial/GCPAuth/Namespaces 0.13
54 TestAddons/StoppedEnableDisable 92.65
55 TestCertOptions 51.09
56 TestCertExpiration 258.04
58 TestForceSystemdFlag 52.87
59 TestForceSystemdEnv 47.52
61 TestKVMDriverInstallOrUpdate 22.75
65 TestErrorSpam/setup 40.47
66 TestErrorSpam/start 0.32
67 TestErrorSpam/status 0.71
68 TestErrorSpam/pause 1.5
69 TestErrorSpam/unpause 1.52
70 TestErrorSpam/stop 5.03
73 TestFunctional/serial/CopySyncFile 0
74 TestFunctional/serial/StartWithProxy 83.77
75 TestFunctional/serial/AuditLog 0
76 TestFunctional/serial/SoftStart 42.2
77 TestFunctional/serial/KubeContext 0.04
78 TestFunctional/serial/KubectlGetPods 0.07
81 TestFunctional/serial/CacheCmd/cache/add_remote 3.48
82 TestFunctional/serial/CacheCmd/cache/add_local 2.95
83 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.04
84 TestFunctional/serial/CacheCmd/cache/list 0.04
85 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.22
86 TestFunctional/serial/CacheCmd/cache/cache_reload 1.71
87 TestFunctional/serial/CacheCmd/cache/delete 0.09
88 TestFunctional/serial/MinikubeKubectlCmd 0.1
89 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.1
90 TestFunctional/serial/ExtraConfig 44.52
91 TestFunctional/serial/ComponentHealth 0.06
92 TestFunctional/serial/LogsCmd 1.38
93 TestFunctional/serial/LogsFileCmd 1.41
94 TestFunctional/serial/InvalidService 4.23
96 TestFunctional/parallel/ConfigCmd 0.3
97 TestFunctional/parallel/DashboardCmd 20.69
98 TestFunctional/parallel/DryRun 0.31
99 TestFunctional/parallel/InternationalLanguage 0.15
100 TestFunctional/parallel/StatusCmd 0.89
104 TestFunctional/parallel/ServiceCmdConnect 8.56
105 TestFunctional/parallel/AddonsCmd 0.12
106 TestFunctional/parallel/PersistentVolumeClaim 39.98
108 TestFunctional/parallel/SSHCmd 0.42
109 TestFunctional/parallel/CpCmd 1.31
110 TestFunctional/parallel/MySQL 34.02
111 TestFunctional/parallel/FileSync 0.23
112 TestFunctional/parallel/CertSync 1.52
116 TestFunctional/parallel/NodeLabels 0.07
118 TestFunctional/parallel/NonActiveRuntimeDisabled 0.56
120 TestFunctional/parallel/License 0.96
121 TestFunctional/parallel/ServiceCmd/DeployApp 10.21
122 TestFunctional/parallel/ProfileCmd/profile_not_create 0.32
123 TestFunctional/parallel/ProfileCmd/profile_list 0.29
124 TestFunctional/parallel/ProfileCmd/profile_json_output 0.31
125 TestFunctional/parallel/MountCmd/any-port 9.6
126 TestFunctional/parallel/UpdateContextCmd/no_changes 0.09
127 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.09
128 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.09
129 TestFunctional/parallel/ImageCommands/ImageListShort 0.25
130 TestFunctional/parallel/ImageCommands/ImageListTable 0.26
131 TestFunctional/parallel/ImageCommands/ImageListJson 0.24
132 TestFunctional/parallel/ImageCommands/ImageListYaml 0.29
133 TestFunctional/parallel/ImageCommands/ImageBuild 4.68
134 TestFunctional/parallel/ImageCommands/Setup 2.75
135 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 1.73
136 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 1.06
137 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 2.62
138 TestFunctional/parallel/ServiceCmd/List 0.45
139 TestFunctional/parallel/MountCmd/specific-port 1.68
140 TestFunctional/parallel/ServiceCmd/JSONOutput 0.49
141 TestFunctional/parallel/ServiceCmd/HTTPS 0.39
142 TestFunctional/parallel/ServiceCmd/Format 0.3
143 TestFunctional/parallel/ImageCommands/ImageSaveToFile 0.99
144 TestFunctional/parallel/ServiceCmd/URL 0.46
145 TestFunctional/parallel/MountCmd/VerifyCleanup 1.68
146 TestFunctional/parallel/Version/short 0.05
147 TestFunctional/parallel/Version/components 0.64
148 TestFunctional/parallel/ImageCommands/ImageRemove 0.51
149 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.01
150 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 0.68
160 TestFunctional/delete_echo-server_images 0.03
161 TestFunctional/delete_my-image_image 0.02
162 TestFunctional/delete_minikube_cached_images 0.01
170 TestMultiControlPlane/serial/NodeLabels 0.06
184 TestJSONOutput/start/Command 96.7
185 TestJSONOutput/start/Audit 0
187 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
188 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
190 TestJSONOutput/pause/Command 0.7
191 TestJSONOutput/pause/Audit 0
193 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
194 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
196 TestJSONOutput/unpause/Command 0.63
197 TestJSONOutput/unpause/Audit 0
199 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
200 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
202 TestJSONOutput/stop/Command 6.5
203 TestJSONOutput/stop/Audit 0
205 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
206 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
207 TestErrorJSONOutput 0.18
212 TestMainNoArgs 0.04
213 TestMinikubeProfile 91.81
216 TestMountStart/serial/StartWithMountFirst 27.8
217 TestMountStart/serial/VerifyMountFirst 0.37
218 TestMountStart/serial/StartWithMountSecond 32.04
219 TestMountStart/serial/VerifyMountSecond 0.35
220 TestMountStart/serial/DeleteFirst 0.68
221 TestMountStart/serial/VerifyMountPostDelete 0.35
222 TestMountStart/serial/Stop 1.58
223 TestMountStart/serial/RestartStopped 22.57
224 TestMountStart/serial/VerifyMountPostStop 0.36
227 TestMultiNode/serial/FreshStart2Nodes 124.42
228 TestMultiNode/serial/DeployApp2Nodes 5.82
229 TestMultiNode/serial/PingHostFrom2Pods 0.75
230 TestMultiNode/serial/AddNode 52.29
231 TestMultiNode/serial/MultiNodeLabels 0.06
232 TestMultiNode/serial/ProfileList 0.2
233 TestMultiNode/serial/CopyFile 6.83
234 TestMultiNode/serial/StopNode 2.13
235 TestMultiNode/serial/StartAfterStop 34.66
236 TestMultiNode/serial/RestartKeepsNodes 332.68
237 TestMultiNode/serial/DeleteNode 2.1
238 TestMultiNode/serial/StopMultiNode 183.08
239 TestMultiNode/serial/RestartMultiNode 107.29
240 TestMultiNode/serial/ValidateNameConflict 44.01
245 TestPreload 295.27
247 TestScheduledStopUnix 118.46
251 TestRunningBinaryUpgrade 177.14
253 TestKubernetesUpgrade 138.95
256 TestNoKubernetes/serial/StartNoK8sWithVersion 0.08
265 TestStartStop/group/old-k8s-version/serial/FirstStart 159.71
266 TestNoKubernetes/serial/StartWithK8s 97.29
274 TestNetworkPlugins/group/false 2.92
278 TestNoKubernetes/serial/StartWithStopK8s 38.81
279 TestNoKubernetes/serial/Start 38.03
280 TestStartStop/group/old-k8s-version/serial/DeployApp 10.59
281 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 1.2
282 TestStartStop/group/old-k8s-version/serial/Stop 91.89
283 TestNoKubernetes/serial/VerifyK8sNotRunning 0.2
285 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.18
286 TestStartStop/group/old-k8s-version/serial/SecondStart 563.64
288 TestPause/serial/Start 73.26
289 TestPause/serial/SecondStartNoReconfiguration 41.17
290 TestPause/serial/Pause 0.68
291 TestPause/serial/VerifyStatus 0.26
292 TestPause/serial/Unpause 0.79
293 TestPause/serial/PauseAgain 1.03
294 TestPause/serial/DeletePaused 1.09
295 TestPause/serial/VerifyDeletedResources 0.55
297 TestStartStop/group/no-preload/serial/FirstStart 100.44
299 TestStartStop/group/embed-certs/serial/FirstStart 97.72
301 TestStartStop/group/default-k8s-diff-port/serial/FirstStart 85.23
302 TestStartStop/group/no-preload/serial/DeployApp 9.31
303 TestStartStop/group/no-preload/serial/EnableAddonWhileActive 1.07
304 TestStartStop/group/no-preload/serial/Stop 91.64
305 TestStartStop/group/default-k8s-diff-port/serial/DeployApp 10.26
306 TestStartStop/group/embed-certs/serial/DeployApp 10.27
307 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive 1.01
308 TestStartStop/group/default-k8s-diff-port/serial/Stop 91.63
309 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 0.95
310 TestStartStop/group/embed-certs/serial/Stop 91.62
311 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.17
312 TestStartStop/group/no-preload/serial/SecondStart 309.32
313 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop 0.17
314 TestStartStop/group/default-k8s-diff-port/serial/SecondStart 317.04
315 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.18
316 TestStartStop/group/embed-certs/serial/SecondStart 311.2
317 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 6.01
318 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 6.08
319 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.22
320 TestStartStop/group/old-k8s-version/serial/Pause 2.54
322 TestStartStop/group/newest-cni/serial/FirstStart 51.6
323 TestStartStop/group/newest-cni/serial/DeployApp 0
324 TestStartStop/group/newest-cni/serial/EnableAddonWhileActive 1.13
325 TestStartStop/group/newest-cni/serial/Stop 91.72
326 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 9.02
327 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 6.08
328 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.23
329 TestStartStop/group/no-preload/serial/Pause 2.77
330 TestStoppedBinaryUpgrade/Setup 3.14
331 TestStoppedBinaryUpgrade/Upgrade 143.39
332 TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop 12.31
333 TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop 6.01
334 TestStartStop/group/embed-certs/serial/AddonExistsAfterStop 5.07
335 TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop 6.08
336 TestStartStop/group/embed-certs/serial/VerifyKubernetesImages 0.33
337 TestStartStop/group/embed-certs/serial/Pause 2.94
338 TestNetworkPlugins/group/auto/Start 59.62
339 TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages 0.22
340 TestStartStop/group/default-k8s-diff-port/serial/Pause 2.75
341 TestNetworkPlugins/group/kindnet/Start 93.88
342 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.17
343 TestStartStop/group/newest-cni/serial/SecondStart 77.69
344 TestNetworkPlugins/group/auto/KubeletFlags 0.24
345 TestNetworkPlugins/group/auto/NetCatPod 10.25
346 TestNetworkPlugins/group/auto/DNS 33.39
347 TestNetworkPlugins/group/kindnet/ControllerPod 6.01
348 TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop 0
349 TestStartStop/group/newest-cni/serial/AddonExistsAfterStop 0
350 TestStartStop/group/newest-cni/serial/VerifyKubernetesImages 0.27
351 TestStartStop/group/newest-cni/serial/Pause 3.59
352 TestNetworkPlugins/group/auto/Localhost 0.17
353 TestNetworkPlugins/group/auto/HairPin 0.16
354 TestStoppedBinaryUpgrade/MinikubeLogs 1.14
355 TestNetworkPlugins/group/kindnet/KubeletFlags 0.23
356 TestNetworkPlugins/group/kindnet/NetCatPod 10.33
357 TestNetworkPlugins/group/calico/Start 96.08
358 TestNetworkPlugins/group/custom-flannel/Start 117.49
359 TestNetworkPlugins/group/kindnet/DNS 0.16
360 TestNetworkPlugins/group/kindnet/Localhost 0.15
361 TestNetworkPlugins/group/kindnet/HairPin 0.16
362 TestNetworkPlugins/group/enable-default-cni/Start 109.34
363 TestNetworkPlugins/group/flannel/Start 141.51
364 TestNetworkPlugins/group/calico/ControllerPod 6.01
365 TestNetworkPlugins/group/calico/KubeletFlags 0.22
366 TestNetworkPlugins/group/calico/NetCatPod 9.23
367 TestNetworkPlugins/group/calico/DNS 0.17
368 TestNetworkPlugins/group/calico/Localhost 0.13
369 TestNetworkPlugins/group/calico/HairPin 0.13
370 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.24
371 TestNetworkPlugins/group/custom-flannel/NetCatPod 10.28
372 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.27
373 TestNetworkPlugins/group/enable-default-cni/NetCatPod 11.3
374 TestNetworkPlugins/group/custom-flannel/DNS 0.16
375 TestNetworkPlugins/group/custom-flannel/Localhost 0.14
376 TestNetworkPlugins/group/custom-flannel/HairPin 0.16
377 TestNetworkPlugins/group/bridge/Start 64.44
378 TestNetworkPlugins/group/enable-default-cni/DNS 0.18
379 TestNetworkPlugins/group/enable-default-cni/Localhost 0.13
380 TestNetworkPlugins/group/enable-default-cni/HairPin 0.14
381 TestNetworkPlugins/group/flannel/ControllerPod 6.01
382 TestNetworkPlugins/group/flannel/KubeletFlags 0.21
383 TestNetworkPlugins/group/flannel/NetCatPod 10.23
384 TestNetworkPlugins/group/flannel/DNS 0.16
385 TestNetworkPlugins/group/flannel/Localhost 0.13
386 TestNetworkPlugins/group/flannel/HairPin 0.12
387 TestNetworkPlugins/group/bridge/KubeletFlags 0.21
388 TestNetworkPlugins/group/bridge/NetCatPod 9.2
389 TestNetworkPlugins/group/bridge/DNS 0.15
390 TestNetworkPlugins/group/bridge/Localhost 0.13
391 TestNetworkPlugins/group/bridge/HairPin 0.11
TestDownloadOnly/v1.20.0/json-events (45.22s)

=== RUN   TestDownloadOnly/v1.20.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-linux-amd64 start -o=json --download-only -p download-only-492834 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd
aaa_download_only_test.go:81: (dbg) Done: out/minikube-linux-amd64 start -o=json --download-only -p download-only-492834 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd: (45.222256713s)
--- PASS: TestDownloadOnly/v1.20.0/json-events (45.22s)

TestDownloadOnly/v1.20.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.20.0/preload-exists
--- PASS: TestDownloadOnly/v1.20.0/preload-exists (0.00s)

TestDownloadOnly/v1.20.0/LogsDuration (0.06s)

=== RUN   TestDownloadOnly/v1.20.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-linux-amd64 logs -p download-only-492834
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-linux-amd64 logs -p download-only-492834: exit status 85 (56.385004ms)

                                                
                                                
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-492834 | jenkins | v1.33.1 | 17 Jul 24 17:11 UTC |          |
	|         | -p download-only-492834        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |          |
	|         | --container-runtime=containerd |                      |         |         |                     |          |
	|         | --driver=kvm2                  |                      |         |         |                     |          |
	|         | --container-runtime=containerd |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:11:40
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:11:40.535774   21673 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:11:40.536040   21673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:11:40.536049   21673 out.go:304] Setting ErrFile to fd 2...
	I0717 17:11:40.536055   21673 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:11:40.536240   21673 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	W0717 17:11:40.536366   21673 root.go:314] Error reading config file at /home/jenkins/minikube-integration/19283-14409/.minikube/config/config.json: open /home/jenkins/minikube-integration/19283-14409/.minikube/config/config.json: no such file or directory
	I0717 17:11:40.536937   21673 out.go:298] Setting JSON to true
	I0717 17:11:40.537804   21673 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":3244,"bootTime":1721233057,"procs":174,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:11:40.537861   21673 start.go:139] virtualization: kvm guest
	I0717 17:11:40.540464   21673 out.go:97] [download-only-492834] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	W0717 17:11:40.540562   21673 preload.go:293] Failed to list preload files: open /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball: no such file or directory
	I0717 17:11:40.540618   21673 notify.go:220] Checking for updates...
	I0717 17:11:40.542252   21673 out.go:169] MINIKUBE_LOCATION=19283
	I0717 17:11:40.543653   21673 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:11:40.544963   21673 out.go:169] KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:11:40.546296   21673 out.go:169] MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:11:40.547847   21673 out.go:169] MINIKUBE_BIN=out/minikube-linux-amd64
	W0717 17:11:40.550259   21673 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0717 17:11:40.550523   21673 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:11:40.655175   21673 out.go:97] Using the kvm2 driver based on user configuration
	I0717 17:11:40.655204   21673 start.go:297] selected driver: kvm2
	I0717 17:11:40.655219   21673 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:11:40.655547   21673 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:11:40.655674   21673 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:11:40.670221   21673 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:11:40.670268   21673 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:11:40.670800   21673 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32089MB, container=0MB
	I0717 17:11:40.670985   21673 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0717 17:11:40.671050   21673 cni.go:84] Creating CNI manager for ""
	I0717 17:11:40.671066   21673 cni.go:146] "kvm2" driver + "containerd" runtime found, recommending bridge
	I0717 17:11:40.671078   21673 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0717 17:11:40.671139   21673 start.go:340] cluster config:
	{Name:download-only-492834 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:6000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.20.0 ClusterName:download-only-492834 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.20.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:11:40.671328   21673 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:11:40.673324   21673 out.go:97] Downloading VM boot image ...
	I0717 17:11:40.673378   21673 download.go:107] Downloading: https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/iso/amd64/minikube-v1.33.1-1721146474-19264-amd64.iso
	I0717 17:11:54.123827   21673 out.go:97] Starting "download-only-492834" primary control-plane node in "download-only-492834" cluster
	I0717 17:11:54.123857   21673 preload.go:131] Checking if preload exists for k8s version v1.20.0 and runtime containerd
	I0717 17:11:54.277363   21673 preload.go:118] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4
	I0717 17:11:54.277389   21673 cache.go:56] Caching tarball of preloaded images
	I0717 17:11:54.277551   21673 preload.go:131] Checking if preload exists for k8s version v1.20.0 and runtime containerd
	I0717 17:11:54.279403   21673 out.go:97] Downloading Kubernetes v1.20.0 preload ...
	I0717 17:11:54.279423   21673 preload.go:236] getting checksum for preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:11:54.439512   21673 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4?checksum=md5:c28dc5b6f01e4b826afa7afc8a0fd1fd -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4
	I0717 17:12:17.173870   21673 preload.go:247] saving checksum for preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:12:17.173961   21673 preload.go:254] verifying checksum of /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:12:18.074485   21673 cache.go:59] Finished verifying existence of preloaded tar for v1.20.0 on containerd
	I0717 17:12:18.074805   21673 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/download-only-492834/config.json ...
	I0717 17:12:18.074830   21673 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/download-only-492834/config.json: {Name:mk118659e98c3454738a482173f0b84ba5977eaa Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:12:18.074988   21673 preload.go:131] Checking if preload exists for k8s version v1.20.0 and runtime containerd
	I0717 17:12:18.075149   21673 download.go:107] Downloading: https://dl.k8s.io/release/v1.20.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.20.0/bin/linux/amd64/kubectl.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.20.0/kubectl
	
	
	* The control-plane node download-only-492834 host does not exist
	  To start a cluster, run: "minikube start -p download-only-492834"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.20.0/LogsDuration (0.06s)

                                                
                                    
TestDownloadOnly/v1.20.0/DeleteAll (0.13s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-linux-amd64 delete --all
--- PASS: TestDownloadOnly/v1.20.0/DeleteAll (0.13s)

                                                
                                    
TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.12s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-linux-amd64 delete -p download-only-492834
--- PASS: TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.12s)

                                                
                                    
TestDownloadOnly/v1.30.2/json-events (24.98s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.30.2/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-linux-amd64 start -o=json --download-only -p download-only-835094 --force --alsologtostderr --kubernetes-version=v1.30.2 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd
aaa_download_only_test.go:81: (dbg) Done: out/minikube-linux-amd64 start -o=json --download-only -p download-only-835094 --force --alsologtostderr --kubernetes-version=v1.30.2 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd: (24.976262802s)
--- PASS: TestDownloadOnly/v1.30.2/json-events (24.98s)

                                                
                                    
TestDownloadOnly/v1.30.2/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.30.2/preload-exists
--- PASS: TestDownloadOnly/v1.30.2/preload-exists (0.00s)

                                                
                                    
TestDownloadOnly/v1.30.2/LogsDuration (0.06s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.30.2/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-linux-amd64 logs -p download-only-835094
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-linux-amd64 logs -p download-only-835094: exit status 85 (56.331767ms)

                                                
                                                
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only        | download-only-492834 | jenkins | v1.33.1 | 17 Jul 24 17:11 UTC |                     |
	|         | -p download-only-492834        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |                     |
	|         | --container-runtime=containerd |                      |         |         |                     |                     |
	|         | --driver=kvm2                  |                      |         |         |                     |                     |
	|         | --container-runtime=containerd |                      |         |         |                     |                     |
	| delete  | --all                          | minikube             | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC | 17 Jul 24 17:12 UTC |
	| delete  | -p download-only-492834        | download-only-492834 | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC | 17 Jul 24 17:12 UTC |
	| start   | -o=json --download-only        | download-only-835094 | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC |                     |
	|         | -p download-only-835094        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.30.2   |                      |         |         |                     |                     |
	|         | --container-runtime=containerd |                      |         |         |                     |                     |
	|         | --driver=kvm2                  |                      |         |         |                     |                     |
	|         | --container-runtime=containerd |                      |         |         |                     |                     |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:12:26
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:12:26.057924   21995 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:12:26.058212   21995 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:12:26.058223   21995 out.go:304] Setting ErrFile to fd 2...
	I0717 17:12:26.058227   21995 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:12:26.058426   21995 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:12:26.058971   21995 out.go:298] Setting JSON to true
	I0717 17:12:26.059762   21995 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":3289,"bootTime":1721233057,"procs":172,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:12:26.059819   21995 start.go:139] virtualization: kvm guest
	I0717 17:12:26.062040   21995 out.go:97] [download-only-835094] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:12:26.062172   21995 notify.go:220] Checking for updates...
	I0717 17:12:26.063655   21995 out.go:169] MINIKUBE_LOCATION=19283
	I0717 17:12:26.065131   21995 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:12:26.066668   21995 out.go:169] KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:12:26.067958   21995 out.go:169] MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:12:26.069315   21995 out.go:169] MINIKUBE_BIN=out/minikube-linux-amd64
	W0717 17:12:26.072044   21995 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0717 17:12:26.072313   21995 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:12:26.103242   21995 out.go:97] Using the kvm2 driver based on user configuration
	I0717 17:12:26.103285   21995 start.go:297] selected driver: kvm2
	I0717 17:12:26.103292   21995 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:12:26.103625   21995 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:12:26.103722   21995 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:12:26.118803   21995 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:12:26.118868   21995 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:12:26.119358   21995 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32089MB, container=0MB
	I0717 17:12:26.119495   21995 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0717 17:12:26.119548   21995 cni.go:84] Creating CNI manager for ""
	I0717 17:12:26.119560   21995 cni.go:146] "kvm2" driver + "containerd" runtime found, recommending bridge
	I0717 17:12:26.119567   21995 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0717 17:12:26.119619   21995 start.go:340] cluster config:
	{Name:download-only-835094 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:6000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:download-only-835094 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:12:26.119716   21995 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:12:26.121383   21995 out.go:97] Starting "download-only-835094" primary control-plane node in "download-only-835094" cluster
	I0717 17:12:26.121398   21995 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:12:26.276853   21995 preload.go:118] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.30.2/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:12:26.276881   21995 cache.go:56] Caching tarball of preloaded images
	I0717 17:12:26.277032   21995 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:12:26.279160   21995 out.go:97] Downloading Kubernetes v1.30.2 preload ...
	I0717 17:12:26.279180   21995 preload.go:236] getting checksum for preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:12:26.433322   21995 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.30.2/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4?checksum=md5:a69e65264a76d4a498a2c6efe8e151d6 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4
	I0717 17:12:39.174797   21995 preload.go:247] saving checksum for preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:12:39.174887   21995 preload.go:254] verifying checksum of /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:12:39.918356   21995 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on containerd
	I0717 17:12:39.918663   21995 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/download-only-835094/config.json ...
	I0717 17:12:39.918688   21995 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/download-only-835094/config.json: {Name:mk81f4e95dc51164d401f39870ffff95153a7b18 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:12:39.918861   21995 preload.go:131] Checking if preload exists for k8s version v1.30.2 and runtime containerd
	I0717 17:12:39.919031   21995 download.go:107] Downloading: https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.30.2/bin/linux/amd64/kubectl.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.30.2/kubectl
	
	
	* The control-plane node download-only-835094 host does not exist
	  To start a cluster, run: "minikube start -p download-only-835094"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.30.2/LogsDuration (0.06s)

                                                
                                    
TestDownloadOnly/v1.30.2/DeleteAll (0.13s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.30.2/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-linux-amd64 delete --all
--- PASS: TestDownloadOnly/v1.30.2/DeleteAll (0.13s)

                                                
                                    
TestDownloadOnly/v1.30.2/DeleteAlwaysSucceeds (0.12s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.30.2/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-linux-amd64 delete -p download-only-835094
--- PASS: TestDownloadOnly/v1.30.2/DeleteAlwaysSucceeds (0.12s)

                                                
                                    
TestDownloadOnly/v1.31.0-beta.0/json-events (28.44s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0-beta.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-linux-amd64 start -o=json --download-only -p download-only-740215 --force --alsologtostderr --kubernetes-version=v1.31.0-beta.0 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd
aaa_download_only_test.go:81: (dbg) Done: out/minikube-linux-amd64 start -o=json --download-only -p download-only-740215 --force --alsologtostderr --kubernetes-version=v1.31.0-beta.0 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd: (28.435405127s)
--- PASS: TestDownloadOnly/v1.31.0-beta.0/json-events (28.44s)

                                                
                                    
TestDownloadOnly/v1.31.0-beta.0/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0-beta.0/preload-exists
--- PASS: TestDownloadOnly/v1.31.0-beta.0/preload-exists (0.00s)

                                                
                                    
TestDownloadOnly/v1.31.0-beta.0/LogsDuration (0.06s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.31.0-beta.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-linux-amd64 logs -p download-only-740215
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-linux-amd64 logs -p download-only-740215: exit status 85 (57.723377ms)

                                                
                                                
-- stdout --
	
	==> Audit <==
	|---------|-------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |                Args                 |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|-------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only             | download-only-492834 | jenkins | v1.33.1 | 17 Jul 24 17:11 UTC |                     |
	|         | -p download-only-492834             |                      |         |         |                     |                     |
	|         | --force --alsologtostderr           |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0        |                      |         |         |                     |                     |
	|         | --container-runtime=containerd      |                      |         |         |                     |                     |
	|         | --driver=kvm2                       |                      |         |         |                     |                     |
	|         | --container-runtime=containerd      |                      |         |         |                     |                     |
	| delete  | --all                               | minikube             | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC | 17 Jul 24 17:12 UTC |
	| delete  | -p download-only-492834             | download-only-492834 | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC | 17 Jul 24 17:12 UTC |
	| start   | -o=json --download-only             | download-only-835094 | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC |                     |
	|         | -p download-only-835094             |                      |         |         |                     |                     |
	|         | --force --alsologtostderr           |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.30.2        |                      |         |         |                     |                     |
	|         | --container-runtime=containerd      |                      |         |         |                     |                     |
	|         | --driver=kvm2                       |                      |         |         |                     |                     |
	|         | --container-runtime=containerd      |                      |         |         |                     |                     |
	| delete  | --all                               | minikube             | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC | 17 Jul 24 17:12 UTC |
	| delete  | -p download-only-835094             | download-only-835094 | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC | 17 Jul 24 17:12 UTC |
	| start   | -o=json --download-only             | download-only-740215 | jenkins | v1.33.1 | 17 Jul 24 17:12 UTC |                     |
	|         | -p download-only-740215             |                      |         |         |                     |                     |
	|         | --force --alsologtostderr           |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.31.0-beta.0 |                      |         |         |                     |                     |
	|         | --container-runtime=containerd      |                      |         |         |                     |                     |
	|         | --driver=kvm2                       |                      |         |         |                     |                     |
	|         | --container-runtime=containerd      |                      |         |         |                     |                     |
	|---------|-------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/17 17:12:51
	Running on machine: ubuntu-20-agent-5
	Binary: Built with gc go1.22.5 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0717 17:12:51.336890   22255 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:12:51.336982   22255 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:12:51.336990   22255 out.go:304] Setting ErrFile to fd 2...
	I0717 17:12:51.336995   22255 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:12:51.337148   22255 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:12:51.337690   22255 out.go:298] Setting JSON to true
	I0717 17:12:51.338534   22255 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":3314,"bootTime":1721233057,"procs":172,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:12:51.338590   22255 start.go:139] virtualization: kvm guest
	I0717 17:12:51.340612   22255 out.go:97] [download-only-740215] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:12:51.340751   22255 notify.go:220] Checking for updates...
	I0717 17:12:51.341985   22255 out.go:169] MINIKUBE_LOCATION=19283
	I0717 17:12:51.343439   22255 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:12:51.344892   22255 out.go:169] KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:12:51.346172   22255 out.go:169] MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:12:51.347421   22255 out.go:169] MINIKUBE_BIN=out/minikube-linux-amd64
	W0717 17:12:51.349466   22255 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0717 17:12:51.349669   22255 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:12:51.380443   22255 out.go:97] Using the kvm2 driver based on user configuration
	I0717 17:12:51.380469   22255 start.go:297] selected driver: kvm2
	I0717 17:12:51.380480   22255 start.go:901] validating driver "kvm2" against <nil>
	I0717 17:12:51.380799   22255 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:12:51.380863   22255 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/19283-14409/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0717 17:12:51.395226   22255 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.33.1
	I0717 17:12:51.395276   22255 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0717 17:12:51.395768   22255 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32089MB, container=0MB
	I0717 17:12:51.395905   22255 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0717 17:12:51.395928   22255 cni.go:84] Creating CNI manager for ""
	I0717 17:12:51.395935   22255 cni.go:146] "kvm2" driver + "containerd" runtime found, recommending bridge
	I0717 17:12:51.395944   22255 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0717 17:12:51.396002   22255 start.go:340] cluster config:
	{Name:download-only-740215 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:6000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.31.0-beta.0 ClusterName:download-only-740215 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.31.0-beta.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:12:51.396096   22255 iso.go:125] acquiring lock: {Name:mk9ca422a70055a342d5e4afb354786e16c8e9d2 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0717 17:12:51.397670   22255 out.go:97] Starting "download-only-740215" primary control-plane node in "download-only-740215" cluster
	I0717 17:12:51.397694   22255 preload.go:131] Checking if preload exists for k8s version v1.31.0-beta.0 and runtime containerd
	I0717 17:12:51.627757   22255 preload.go:118] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.31.0-beta.0/preloaded-images-k8s-v18-v1.31.0-beta.0-containerd-overlay2-amd64.tar.lz4
	I0717 17:12:51.627826   22255 cache.go:56] Caching tarball of preloaded images
	I0717 17:12:51.628011   22255 preload.go:131] Checking if preload exists for k8s version v1.31.0-beta.0 and runtime containerd
	I0717 17:12:51.630077   22255 out.go:97] Downloading Kubernetes v1.31.0-beta.0 preload ...
	I0717 17:12:51.630099   22255 preload.go:236] getting checksum for preloaded-images-k8s-v18-v1.31.0-beta.0-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:12:51.787797   22255 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.31.0-beta.0/preloaded-images-k8s-v18-v1.31.0-beta.0-containerd-overlay2-amd64.tar.lz4?checksum=md5:317e542de842a84eade9a0e3b4ea7005 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-beta.0-containerd-overlay2-amd64.tar.lz4
	I0717 17:13:09.349670   22255 preload.go:247] saving checksum for preloaded-images-k8s-v18-v1.31.0-beta.0-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:13:09.349799   22255 preload.go:254] verifying checksum of /home/jenkins/minikube-integration/19283-14409/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.31.0-beta.0-containerd-overlay2-amd64.tar.lz4 ...
	I0717 17:13:10.190933   22255 cache.go:59] Finished verifying existence of preloaded tar for v1.31.0-beta.0 on containerd
	I0717 17:13:10.191275   22255 profile.go:143] Saving config to /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/download-only-740215/config.json ...
	I0717 17:13:10.191312   22255 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/download-only-740215/config.json: {Name:mk82c4fc157a02b782032a0ab31698d4d537cb51 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0717 17:13:10.191489   22255 preload.go:131] Checking if preload exists for k8s version v1.31.0-beta.0 and runtime containerd
	I0717 17:13:10.191641   22255 download.go:107] Downloading: https://dl.k8s.io/release/v1.31.0-beta.0/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.31.0-beta.0/bin/linux/amd64/kubectl.sha256 -> /home/jenkins/minikube-integration/19283-14409/.minikube/cache/linux/amd64/v1.31.0-beta.0/kubectl
	
	
	* The control-plane node download-only-740215 host does not exist
	  To start a cluster, run: "minikube start -p download-only-740215"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.31.0-beta.0/LogsDuration (0.06s)

TestDownloadOnly/v1.31.0-beta.0/DeleteAll (0.13s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-linux-amd64 delete --all
--- PASS: TestDownloadOnly/v1.31.0-beta.0/DeleteAll (0.13s)

TestDownloadOnly/v1.31.0-beta.0/DeleteAlwaysSucceeds (0.12s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-linux-amd64 delete -p download-only-740215
--- PASS: TestDownloadOnly/v1.31.0-beta.0/DeleteAlwaysSucceeds (0.12s)

TestBinaryMirror (0.55s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:314: (dbg) Run:  out/minikube-linux-amd64 start --download-only -p binary-mirror-401955 --alsologtostderr --binary-mirror http://127.0.0.1:35181 --driver=kvm2  --container-runtime=containerd
helpers_test.go:175: Cleaning up "binary-mirror-401955" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p binary-mirror-401955
--- PASS: TestBinaryMirror (0.55s)

TestOffline (86.97s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-linux-amd64 start -p offline-containerd-956029 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=kvm2  --container-runtime=containerd
aab_offline_test.go:55: (dbg) Done: out/minikube-linux-amd64 start -p offline-containerd-956029 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=kvm2  --container-runtime=containerd: (1m25.500896063s)
helpers_test.go:175: Cleaning up "offline-containerd-956029" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p offline-containerd-956029
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p offline-containerd-956029: (1.464516722s)
--- PASS: TestOffline (86.97s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.05s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1029: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p addons-566926
addons_test.go:1029: (dbg) Non-zero exit: out/minikube-linux-amd64 addons enable dashboard -p addons-566926: exit status 85 (51.478752ms)

-- stdout --
	* Profile "addons-566926" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-566926"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.05s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.05s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1040: (dbg) Run:  out/minikube-linux-amd64 addons disable dashboard -p addons-566926
addons_test.go:1040: (dbg) Non-zero exit: out/minikube-linux-amd64 addons disable dashboard -p addons-566926: exit status 85 (48.669777ms)

-- stdout --
	* Profile "addons-566926" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-566926"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.05s)

TestAddons/Setup (271.13s)

=== RUN   TestAddons/Setup
addons_test.go:110: (dbg) Run:  out/minikube-linux-amd64 start -p addons-566926 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --addons=volcano --driver=kvm2  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:110: (dbg) Done: out/minikube-linux-amd64 start -p addons-566926 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --addons=volcano --driver=kvm2  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=helm-tiller: (4m31.128330503s)
--- PASS: TestAddons/Setup (271.13s)

TestAddons/parallel/Registry (19.86s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:332: registry stabilized in 27.572276ms
addons_test.go:334: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-656c9c8d9c-gj8zf" [d3c3f3ea-ee16-4c2e-b37e-cd5b309202ad] Running
addons_test.go:334: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.008696563s
addons_test.go:337: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-proxy-f5xc7" [59fb7f1d-52e5-4fad-be04-d8d6a1db1e87] Running
addons_test.go:337: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 6.004830986s
addons_test.go:342: (dbg) Run:  kubectl --context addons-566926 delete po -l run=registry-test --now
addons_test.go:347: (dbg) Run:  kubectl --context addons-566926 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:347: (dbg) Done: kubectl --context addons-566926 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (8.035153744s)
addons_test.go:361: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 ip
2024/07/17 17:18:11 [DEBUG] GET http://192.168.39.49:5000
addons_test.go:390: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (19.86s)

TestAddons/parallel/Ingress (21.2s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:209: (dbg) Run:  kubectl --context addons-566926 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:234: (dbg) Run:  kubectl --context addons-566926 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:247: (dbg) Run:  kubectl --context addons-566926 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [af597576-4706-45ea-a875-d30d1b966095] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [af597576-4706-45ea-a875-d30d1b966095] Running
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 11.004435873s
addons_test.go:264: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:288: (dbg) Run:  kubectl --context addons-566926 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:293: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 ip
addons_test.go:299: (dbg) Run:  nslookup hello-john.test 192.168.39.49
addons_test.go:308: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:308: (dbg) Done: out/minikube-linux-amd64 -p addons-566926 addons disable ingress-dns --alsologtostderr -v=1: (1.278840779s)
addons_test.go:313: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable ingress --alsologtostderr -v=1
addons_test.go:313: (dbg) Done: out/minikube-linux-amd64 -p addons-566926 addons disable ingress --alsologtostderr -v=1: (7.753722395s)
--- PASS: TestAddons/parallel/Ingress (21.20s)

TestAddons/parallel/InspektorGadget (10.86s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:840: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:344: "gadget-znkkg" [bd7cabbd-227a-4724-b2b9-ab985484d3ad] Running / Ready:ContainersNotReady (containers with unready status: [gadget]) / ContainersReady:ContainersNotReady (containers with unready status: [gadget])
addons_test.go:840: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.005097614s
addons_test.go:843: (dbg) Run:  out/minikube-linux-amd64 addons disable inspektor-gadget -p addons-566926
addons_test.go:843: (dbg) Done: out/minikube-linux-amd64 addons disable inspektor-gadget -p addons-566926: (5.851747891s)
--- PASS: TestAddons/parallel/InspektorGadget (10.86s)

TestAddons/parallel/MetricsServer (5.8s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:409: metrics-server stabilized in 2.806307ms
addons_test.go:411: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:344: "metrics-server-c59844bb4-r6lck" [e18b62f8-3ec0-4f40-b71d-e49db08062ba] Running
addons_test.go:411: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.0049344s
addons_test.go:417: (dbg) Run:  kubectl --context addons-566926 top pods -n kube-system
addons_test.go:434: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.80s)

TestAddons/parallel/HelmTiller (16.67s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:458: tiller-deploy stabilized in 27.886146ms
addons_test.go:460: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:344: "tiller-deploy-6677d64bcd-sfqld" [94fc4bf6-8a88-45de-9b6c-d2984fd4e6f4] Running
addons_test.go:460: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.02117424s
addons_test.go:475: (dbg) Run:  kubectl --context addons-566926 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:475: (dbg) Done: kubectl --context addons-566926 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version: (10.949167s)
addons_test.go:492: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (16.67s)

TestAddons/parallel/CSI (53.36s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:563: csi-hostpath-driver pods stabilized in 6.974138ms
addons_test.go:566: (dbg) Run:  kubectl --context addons-566926 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:571: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:576: (dbg) Run:  kubectl --context addons-566926 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:581: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:344: "task-pv-pod" [d9800f98-d745-4597-a08b-f3a15ffcf2da] Pending
helpers_test.go:344: "task-pv-pod" [d9800f98-d745-4597-a08b-f3a15ffcf2da] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod" [d9800f98-d745-4597-a08b-f3a15ffcf2da] Running
addons_test.go:581: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 15.004912279s
addons_test.go:586: (dbg) Run:  kubectl --context addons-566926 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:591: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:419: (dbg) Run:  kubectl --context addons-566926 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:427: TestAddons/parallel/CSI: WARNING: volume snapshot get for "default" "new-snapshot-demo" returned: 
helpers_test.go:419: (dbg) Run:  kubectl --context addons-566926 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:596: (dbg) Run:  kubectl --context addons-566926 delete pod task-pv-pod
addons_test.go:602: (dbg) Run:  kubectl --context addons-566926 delete pvc hpvc
addons_test.go:608: (dbg) Run:  kubectl --context addons-566926 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:613: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:618: (dbg) Run:  kubectl --context addons-566926 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:623: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:344: "task-pv-pod-restore" [fa40ef9b-4f16-488c-be26-e19a57fd86ef] Pending
helpers_test.go:344: "task-pv-pod-restore" [fa40ef9b-4f16-488c-be26-e19a57fd86ef] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod-restore" [fa40ef9b-4f16-488c-be26-e19a57fd86ef] Running
addons_test.go:623: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 9.003604171s
addons_test.go:628: (dbg) Run:  kubectl --context addons-566926 delete pod task-pv-pod-restore
addons_test.go:632: (dbg) Run:  kubectl --context addons-566926 delete pvc hpvc-restore
addons_test.go:636: (dbg) Run:  kubectl --context addons-566926 delete volumesnapshot new-snapshot-demo
addons_test.go:640: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:640: (dbg) Done: out/minikube-linux-amd64 -p addons-566926 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.778017526s)
addons_test.go:644: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (53.36s)

TestAddons/parallel/Headlamp (16.99s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:826: (dbg) Run:  out/minikube-linux-amd64 addons enable headlamp -p addons-566926 --alsologtostderr -v=1
addons_test.go:831: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:344: "headlamp-7867546754-wdbs2" [5231d5b3-f073-44ca-b721-cf786d29f1a8] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:344: "headlamp-7867546754-wdbs2" [5231d5b3-f073-44ca-b721-cf786d29f1a8] Running
addons_test.go:831: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 16.008966537s
--- PASS: TestAddons/parallel/Headlamp (16.99s)

TestAddons/parallel/CloudSpanner (5.54s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:859: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:344: "cloud-spanner-emulator-6fcd4f6f98-vq76m" [cbddf6d4-f05a-4421-81e4-9ea90bf13796] Running
addons_test.go:859: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.004671395s
addons_test.go:862: (dbg) Run:  out/minikube-linux-amd64 addons disable cloud-spanner -p addons-566926
--- PASS: TestAddons/parallel/CloudSpanner (5.54s)

TestAddons/parallel/LocalPath (17.13s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:974: (dbg) Run:  kubectl --context addons-566926 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:980: (dbg) Run:  kubectl --context addons-566926 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:984: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-566926 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:987: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:344: "test-local-path" [934a1b28-bcb2-4ffa-83de-54e533bab845] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "test-local-path" [934a1b28-bcb2-4ffa-83de-54e533bab845] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "test-local-path" [934a1b28-bcb2-4ffa-83de-54e533bab845] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:987: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 10.003804012s
addons_test.go:992: (dbg) Run:  kubectl --context addons-566926 get pvc test-pvc -o=json
addons_test.go:1001: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 ssh "cat /opt/local-path-provisioner/pvc-4eff94a2-b25f-428a-a290-d81978cb97b2_default_test-pvc/file1"
addons_test.go:1013: (dbg) Run:  kubectl --context addons-566926 delete pod test-local-path
addons_test.go:1017: (dbg) Run:  kubectl --context addons-566926 delete pvc test-pvc
addons_test.go:1021: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable storage-provisioner-rancher --alsologtostderr -v=1
--- PASS: TestAddons/parallel/LocalPath (17.13s)

TestAddons/parallel/NvidiaDevicePlugin (6.68s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1053: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:344: "nvidia-device-plugin-daemonset-bdvfc" [67c3f695-a23c-4ad0-94a3-aa2ba8ef0a01] Running
addons_test.go:1053: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 6.009228573s
addons_test.go:1056: (dbg) Run:  out/minikube-linux-amd64 addons disable nvidia-device-plugin -p addons-566926
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (6.68s)

TestAddons/parallel/Yakd (6.01s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:1064: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:344: "yakd-dashboard-799879c74f-xm7st" [da5f31fd-cfb4-434b-aa1d-933f6ddd07e2] Running
addons_test.go:1064: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 6.006831981s
--- PASS: TestAddons/parallel/Yakd (6.01s)

TestAddons/parallel/Volcano (38.7s)

=== RUN   TestAddons/parallel/Volcano
=== PAUSE TestAddons/parallel/Volcano

=== CONT  TestAddons/parallel/Volcano
addons_test.go:905: volcano-controller stabilized in 7.493991ms
addons_test.go:897: volcano-admission stabilized in 10.273176ms
addons_test.go:889: volcano-scheduler stabilized in 10.326998ms
addons_test.go:911: (dbg) TestAddons/parallel/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-scheduler-844f6db89b-stgjv" [93bfd198-cc8a-42b0-87c7-8976e2c3c5f8] Running
addons_test.go:911: (dbg) TestAddons/parallel/Volcano: app=volcano-scheduler healthy within 6.016209825s
addons_test.go:915: (dbg) TestAddons/parallel/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-admission-5f7844f7bc-rrckt" [42c0567e-15b3-4c2b-9317-90be15d60515] Running
addons_test.go:915: (dbg) TestAddons/parallel/Volcano: app=volcano-admission healthy within 5.007537504s
addons_test.go:919: (dbg) TestAddons/parallel/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-controllers-59cb4746db-9rls2" [f9834d58-2f8f-44d5-9030-25569086574b] Running
addons_test.go:919: (dbg) TestAddons/parallel/Volcano: app=volcano-controller healthy within 5.004810359s
addons_test.go:924: (dbg) Run:  kubectl --context addons-566926 delete -n volcano-system job volcano-admission-init
addons_test.go:930: (dbg) Run:  kubectl --context addons-566926 create -f testdata/vcjob.yaml
addons_test.go:938: (dbg) Run:  kubectl --context addons-566926 get vcjob -n my-volcano
addons_test.go:956: (dbg) TestAddons/parallel/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:344: "test-job-nginx-0" [a75119f2-4df6-40e0-b463-5ac1e83140fb] Pending
helpers_test.go:344: "test-job-nginx-0" [a75119f2-4df6-40e0-b463-5ac1e83140fb] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "test-job-nginx-0" [a75119f2-4df6-40e0-b463-5ac1e83140fb] Running
addons_test.go:956: (dbg) TestAddons/parallel/Volcano: volcano.sh/job-name=test-job healthy within 12.003690806s
addons_test.go:960: (dbg) Run:  out/minikube-linux-amd64 -p addons-566926 addons disable volcano --alsologtostderr -v=1
addons_test.go:960: (dbg) Done: out/minikube-linux-amd64 -p addons-566926 addons disable volcano --alsologtostderr -v=1: (10.291126615s)
--- PASS: TestAddons/parallel/Volcano (38.70s)

TestAddons/serial/GCPAuth/Namespaces (0.13s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:652: (dbg) Run:  kubectl --context addons-566926 create ns new-namespace
addons_test.go:666: (dbg) Run:  kubectl --context addons-566926 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.13s)

TestAddons/StoppedEnableDisable (92.65s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:174: (dbg) Run:  out/minikube-linux-amd64 stop -p addons-566926
addons_test.go:174: (dbg) Done: out/minikube-linux-amd64 stop -p addons-566926: (1m32.400231172s)
addons_test.go:178: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p addons-566926
addons_test.go:182: (dbg) Run:  out/minikube-linux-amd64 addons disable dashboard -p addons-566926
addons_test.go:187: (dbg) Run:  out/minikube-linux-amd64 addons disable gvisor -p addons-566926
--- PASS: TestAddons/StoppedEnableDisable (92.65s)

TestCertOptions (51.09s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-linux-amd64 start -p cert-options-761292 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=kvm2  --container-runtime=containerd
cert_options_test.go:49: (dbg) Done: out/minikube-linux-amd64 start -p cert-options-761292 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=kvm2  --container-runtime=containerd: (49.846319239s)
cert_options_test.go:60: (dbg) Run:  out/minikube-linux-amd64 -p cert-options-761292 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-761292 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-linux-amd64 ssh -p cert-options-761292 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-761292" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p cert-options-761292
--- PASS: TestCertOptions (51.09s)

TestCertExpiration (258.04s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-linux-amd64 start -p cert-expiration-930377 --memory=2048 --cert-expiration=3m --driver=kvm2  --container-runtime=containerd
cert_options_test.go:123: (dbg) Done: out/minikube-linux-amd64 start -p cert-expiration-930377 --memory=2048 --cert-expiration=3m --driver=kvm2  --container-runtime=containerd: (1m6.453346741s)
cert_options_test.go:131: (dbg) Run:  out/minikube-linux-amd64 start -p cert-expiration-930377 --memory=2048 --cert-expiration=8760h --driver=kvm2  --container-runtime=containerd
E0717 18:29:41.797520   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
cert_options_test.go:131: (dbg) Done: out/minikube-linux-amd64 start -p cert-expiration-930377 --memory=2048 --cert-expiration=8760h --driver=kvm2  --container-runtime=containerd: (10.548266052s)
helpers_test.go:175: Cleaning up "cert-expiration-930377" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p cert-expiration-930377
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p cert-expiration-930377: (1.035870997s)
--- PASS: TestCertExpiration (258.04s)

TestForceSystemdFlag (52.87s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-linux-amd64 start -p force-systemd-flag-081918 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=kvm2  --container-runtime=containerd
docker_test.go:91: (dbg) Done: out/minikube-linux-amd64 start -p force-systemd-flag-081918 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=kvm2  --container-runtime=containerd: (51.70370136s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-amd64 -p force-systemd-flag-081918 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-flag-081918" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p force-systemd-flag-081918
--- PASS: TestForceSystemdFlag (52.87s)

TestForceSystemdEnv (47.52s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-linux-amd64 start -p force-systemd-env-586532 --memory=2048 --alsologtostderr -v=5 --driver=kvm2  --container-runtime=containerd
docker_test.go:155: (dbg) Done: out/minikube-linux-amd64 start -p force-systemd-env-586532 --memory=2048 --alsologtostderr -v=5 --driver=kvm2  --container-runtime=containerd: (46.324400082s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-amd64 -p force-systemd-env-586532 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-env-586532" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p force-systemd-env-586532
--- PASS: TestForceSystemdEnv (47.52s)

TestKVMDriverInstallOrUpdate (22.75s)

=== RUN   TestKVMDriverInstallOrUpdate
=== PAUSE TestKVMDriverInstallOrUpdate

=== CONT  TestKVMDriverInstallOrUpdate
--- PASS: TestKVMDriverInstallOrUpdate (22.75s)

TestErrorSpam/setup (40.47s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-linux-amd64 start -p nospam-897711 -n=1 --memory=2250 --wait=false --log_dir=/tmp/nospam-897711 --driver=kvm2  --container-runtime=containerd
error_spam_test.go:81: (dbg) Done: out/minikube-linux-amd64 start -p nospam-897711 -n=1 --memory=2250 --wait=false --log_dir=/tmp/nospam-897711 --driver=kvm2  --container-runtime=containerd: (40.471883055s)
--- PASS: TestErrorSpam/setup (40.47s)

TestErrorSpam/start (0.32s)

=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 start --dry-run
--- PASS: TestErrorSpam/start (0.32s)

TestErrorSpam/status (0.71s)

=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 status
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 status
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 status
--- PASS: TestErrorSpam/status (0.71s)

TestErrorSpam/pause (1.5s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 pause
--- PASS: TestErrorSpam/pause (1.50s)

TestErrorSpam/unpause (1.52s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 unpause
--- PASS: TestErrorSpam/unpause (1.52s)

TestErrorSpam/stop (5.03s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 stop
error_spam_test.go:159: (dbg) Done: out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 stop: (1.413329319s)
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 stop
error_spam_test.go:159: (dbg) Done: out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 stop: (1.995611691s)
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 stop
error_spam_test.go:182: (dbg) Done: out/minikube-linux-amd64 -p nospam-897711 --log_dir /tmp/nospam-897711 stop: (1.623217266s)
--- PASS: TestErrorSpam/stop (5.03s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1851: local sync path: /home/jenkins/minikube-integration/19283-14409/.minikube/files/etc/test/nested/copy/21661/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (83.77s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2230: (dbg) Run:  out/minikube-linux-amd64 start -p functional-142583 --memory=4000 --apiserver-port=8441 --wait=all --driver=kvm2  --container-runtime=containerd
E0717 17:22:52.134931   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:52.140585   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:52.150815   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:52.171082   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:52.211336   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:52.291652   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:52.452044   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:52.772600   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:53.413521   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:54.693998   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:22:57.254771   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
functional_test.go:2230: (dbg) Done: out/minikube-linux-amd64 start -p functional-142583 --memory=4000 --apiserver-port=8441 --wait=all --driver=kvm2  --container-runtime=containerd: (1m23.768244522s)
--- PASS: TestFunctional/serial/StartWithProxy (83.77s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (42.2s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:655: (dbg) Run:  out/minikube-linux-amd64 start -p functional-142583 --alsologtostderr -v=8
E0717 17:23:02.375334   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:23:12.615777   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 17:23:33.096841   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
functional_test.go:655: (dbg) Done: out/minikube-linux-amd64 start -p functional-142583 --alsologtostderr -v=8: (42.196250337s)
functional_test.go:659: soft start took 42.196806658s for "functional-142583" cluster.
--- PASS: TestFunctional/serial/SoftStart (42.20s)

TestFunctional/serial/KubeContext (0.04s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:677: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.04s)

TestFunctional/serial/KubectlGetPods (0.07s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:692: (dbg) Run:  kubectl --context functional-142583 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.07s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.48s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1045: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cache add registry.k8s.io/pause:3.1
functional_test.go:1045: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 cache add registry.k8s.io/pause:3.1: (1.159558231s)
functional_test.go:1045: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cache add registry.k8s.io/pause:3.3
functional_test.go:1045: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 cache add registry.k8s.io/pause:3.3: (1.205738749s)
functional_test.go:1045: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cache add registry.k8s.io/pause:latest
functional_test.go:1045: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 cache add registry.k8s.io/pause:latest: (1.112657065s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.48s)

TestFunctional/serial/CacheCmd/cache/add_local (2.95s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1073: (dbg) Run:  docker build -t minikube-local-cache-test:functional-142583 /tmp/TestFunctionalserialCacheCmdcacheadd_local2368924870/001
functional_test.go:1085: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cache add minikube-local-cache-test:functional-142583
functional_test.go:1085: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 cache add minikube-local-cache-test:functional-142583: (2.658089928s)
functional_test.go:1090: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cache delete minikube-local-cache-test:functional-142583
functional_test.go:1079: (dbg) Run:  docker rmi minikube-local-cache-test:functional-142583
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (2.95s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.04s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1098: (dbg) Run:  out/minikube-linux-amd64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.04s)

                                                
                                    
x
+
TestFunctional/serial/CacheCmd/cache/list (0.04s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1106: (dbg) Run:  out/minikube-linux-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.04s)

                                                
                                    
x
+
TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.22s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1120: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.22s)

                                                
                                    
x
+
TestFunctional/serial/CacheCmd/cache/cache_reload (1.71s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1143: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (205.015453ms)
-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:1154: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cache reload
functional_test.go:1154: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 cache reload: (1.064728867s)
functional_test.go:1159: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.71s)
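The cache_reload test above exercises a remove → verify-missing → reload → verify-present cycle. A minimal stand-in sketch of that cycle, using a temp file to play the cached registry.k8s.io/pause:latest image (minikube and crictl themselves are assumed unavailable here):

```shell
# Stand-in for the cache_reload flow: a temp file plays the cached image.
img=$(mktemp)

# "crictl rmi": remove the image from the runtime.
rm -f "$img"

# "crictl inspecti" on a missing image exits non-zero, as in the log above.
if [ -f "$img" ]; then echo "present"; else echo "no such image"; fi

# "minikube cache reload": push the cached image back into the node.
touch "$img"

# Now inspecti succeeds.
if [ -f "$img" ]; then echo "image restored"; fi
```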
=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1168: (dbg) Run:  out/minikube-linux-amd64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1168: (dbg) Run:  out/minikube-linux-amd64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.09s)
=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:712: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 kubectl -- --context functional-142583 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.10s)
=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:737: (dbg) Run:  out/kubectl --context functional-142583 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.10s)
=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:753: (dbg) Run:  out/minikube-linux-amd64 start -p functional-142583 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E0717 17:24:14.058856   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
functional_test.go:753: (dbg) Done: out/minikube-linux-amd64 start -p functional-142583 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (44.520572321s)
functional_test.go:757: restart took 44.520680301s for "functional-142583" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (44.52s)
=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:806: (dbg) Run:  kubectl --context functional-142583 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:821: etcd phase: Running
functional_test.go:831: etcd status: Ready
functional_test.go:821: kube-apiserver phase: Running
functional_test.go:831: kube-apiserver status: Ready
functional_test.go:821: kube-controller-manager phase: Running
functional_test.go:831: kube-controller-manager status: Ready
functional_test.go:821: kube-scheduler phase: Running
functional_test.go:831: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.06s)
=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1232: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 logs
functional_test.go:1232: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 logs: (1.381924297s)
--- PASS: TestFunctional/serial/LogsCmd (1.38s)
=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1246: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 logs --file /tmp/TestFunctionalserialLogsFileCmd3719721722/001/logs.txt
functional_test.go:1246: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 logs --file /tmp/TestFunctionalserialLogsFileCmd3719721722/001/logs.txt: (1.405660115s)
--- PASS: TestFunctional/serial/LogsFileCmd (1.41s)
=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2317: (dbg) Run:  kubectl --context functional-142583 apply -f testdata/invalidsvc.yaml
functional_test.go:2331: (dbg) Run:  out/minikube-linux-amd64 service invalid-svc -p functional-142583
functional_test.go:2331: (dbg) Non-zero exit: out/minikube-linux-amd64 service invalid-svc -p functional-142583: exit status 115 (265.690319ms)
-- stdout --
	|-----------|-------------|-------------|-----------------------------|
	| NAMESPACE |    NAME     | TARGET PORT |             URL             |
	|-----------|-------------|-------------|-----------------------------|
	| default   | invalid-svc |          80 | http://192.168.50.115:31494 |
	|-----------|-------------|-------------|-----------------------------|
	
	
-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log                 │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
functional_test.go:2323: (dbg) Run:  kubectl --context functional-142583 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (4.23s)
=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 config get cpus: exit status 14 (49.889409ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 config set cpus 2
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 config get cpus
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 config get cpus: exit status 14 (43.956867ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.30s)
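The ConfigCmd runs above show `config get` exiting with status 14 on an unset key. A hypothetical sketch of those set/get/unset semantics, with a flat file standing in for minikube's config store (grep's exit 1 plays the role of minikube's exit 14; the `get_cpus` helper is illustrative, not minikube code):

```shell
# Flat-file stand-in for the minikube config store.
cfg=$(mktemp)

get_cpus() { grep "^cpus=" "$cfg"; }   # non-zero exit when the key is absent

get_cpus || echo "key not found"       # mirrors: Error: specified key could not be found

echo "cpus=2" >> "$cfg"                # config set cpus 2
get_cpus                               # prints cpus=2, exit 0

sed -i '/^cpus=/d' "$cfg"              # config unset cpus
get_cpus || echo "key not found again"
```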
=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd
=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:901: (dbg) daemon: [out/minikube-linux-amd64 dashboard --url --port 36195 -p functional-142583 --alsologtostderr -v=1]
functional_test.go:906: (dbg) stopping [out/minikube-linux-amd64 dashboard --url --port 36195 -p functional-142583 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to kill pid 29887: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (20.69s)
=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun
=== CONT  TestFunctional/parallel/DryRun
functional_test.go:970: (dbg) Run:  out/minikube-linux-amd64 start -p functional-142583 --dry-run --memory 250MB --alsologtostderr --driver=kvm2  --container-runtime=containerd
functional_test.go:970: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p functional-142583 --dry-run --memory 250MB --alsologtostderr --driver=kvm2  --container-runtime=containerd: exit status 23 (174.452582ms)
-- stdout --
	* [functional-142583] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=19283
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the kvm2 driver based on existing profile
	
	
-- /stdout --
** stderr ** 
	I0717 17:24:43.096483   29436 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:24:43.096653   29436 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:24:43.096667   29436 out.go:304] Setting ErrFile to fd 2...
	I0717 17:24:43.096675   29436 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:24:43.096985   29436 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:24:43.097742   29436 out.go:298] Setting JSON to false
	I0717 17:24:43.099097   29436 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4026,"bootTime":1721233057,"procs":214,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:24:43.099182   29436 start.go:139] virtualization: kvm guest
	I0717 17:24:43.101539   29436 out.go:177] * [functional-142583] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 17:24:43.103276   29436 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:24:43.103309   29436 notify.go:220] Checking for updates...
	I0717 17:24:43.105957   29436 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:24:43.107380   29436 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:24:43.108842   29436 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:24:43.110262   29436 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:24:43.111636   29436 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:24:43.113438   29436 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:24:43.113996   29436 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:24:43.114070   29436 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:24:43.132101   29436 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40805
	I0717 17:24:43.132685   29436 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:24:43.133265   29436 main.go:141] libmachine: Using API Version  1
	I0717 17:24:43.133287   29436 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:24:43.133618   29436 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:24:43.134010   29436 main.go:141] libmachine: (functional-142583) Calling .DriverName
	I0717 17:24:43.134269   29436 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:24:43.134547   29436 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:24:43.134582   29436 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:24:43.151082   29436 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35163
	I0717 17:24:43.151525   29436 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:24:43.152064   29436 main.go:141] libmachine: Using API Version  1
	I0717 17:24:43.152090   29436 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:24:43.152413   29436 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:24:43.152609   29436 main.go:141] libmachine: (functional-142583) Calling .DriverName
	I0717 17:24:43.189453   29436 out.go:177] * Using the kvm2 driver based on existing profile
	I0717 17:24:43.190848   29436 start.go:297] selected driver: kvm2
	I0717 17:24:43.190878   29436 start.go:901] validating driver "kvm2" against &{Name:functional-142583 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:4000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.2 ClusterName:functional-142583 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.50.115 Port:8441 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:2
6280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:24:43.191043   29436 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:24:43.194665   29436 out.go:177] 
	W0717 17:24:43.218359   29436 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0717 17:24:43.219715   29436 out.go:177] 
** /stderr **
functional_test.go:987: (dbg) Run:  out/minikube-linux-amd64 start -p functional-142583 --dry-run --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
--- PASS: TestFunctional/parallel/DryRun (0.31s)
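The DryRun failure above is minikube's requested-memory floor check (exit status 23, RSRC_INSUFFICIENT_REQ_MEMORY). A sketch of that guard under the values from the log (250MiB requested vs. a 1800MB usable minimum); the variable names are illustrative, not minikube's:

```shell
req_mb=250     # --memory 250MB from the dry-run invocation
min_mb=1800    # usable minimum reported by minikube

if [ "$req_mb" -lt "$min_mb" ]; then
  echo "X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: ${req_mb}MiB < ${min_mb}MB"
  exit_code=23   # minikube's exit status for this class of error
else
  exit_code=0
fi
echo "dry run would exit with status $exit_code"
```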
=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage
=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1016: (dbg) Run:  out/minikube-linux-amd64 start -p functional-142583 --dry-run --memory 250MB --alsologtostderr --driver=kvm2  --container-runtime=containerd
functional_test.go:1016: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p functional-142583 --dry-run --memory 250MB --alsologtostderr --driver=kvm2  --container-runtime=containerd: exit status 23 (147.141608ms)
-- stdout --
	* [functional-142583] minikube v1.33.1 sur Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=19283
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote kvm2 basé sur le profil existant
	
	
-- /stdout --
** stderr ** 
	I0717 17:24:42.952697   29403 out.go:291] Setting OutFile to fd 1 ...
	I0717 17:24:42.952815   29403 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:24:42.952823   29403 out.go:304] Setting ErrFile to fd 2...
	I0717 17:24:42.952828   29403 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 17:24:42.953095   29403 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 17:24:42.953567   29403 out.go:298] Setting JSON to false
	I0717 17:24:42.954455   29403 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":4026,"bootTime":1721233057,"procs":212,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 17:24:42.954527   29403 start.go:139] virtualization: kvm guest
	I0717 17:24:42.956699   29403 out.go:177] * [functional-142583] minikube v1.33.1 sur Ubuntu 20.04 (kvm/amd64)
	I0717 17:24:42.957976   29403 notify.go:220] Checking for updates...
	I0717 17:24:42.957979   29403 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 17:24:42.959505   29403 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 17:24:42.960811   29403 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 17:24:42.961966   29403 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 17:24:42.963104   29403 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 17:24:42.964314   29403 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 17:24:42.965755   29403 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 17:24:42.966110   29403 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:24:42.966196   29403 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:24:42.983675   29403 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44059
	I0717 17:24:42.984126   29403 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:24:42.984625   29403 main.go:141] libmachine: Using API Version  1
	I0717 17:24:42.984648   29403 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:24:42.985009   29403 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:24:42.985202   29403 main.go:141] libmachine: (functional-142583) Calling .DriverName
	I0717 17:24:42.985423   29403 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 17:24:42.985805   29403 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 17:24:42.985943   29403 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 17:24:43.000842   29403 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45471
	I0717 17:24:43.001225   29403 main.go:141] libmachine: () Calling .GetVersion
	I0717 17:24:43.001784   29403 main.go:141] libmachine: Using API Version  1
	I0717 17:24:43.001814   29403 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 17:24:43.002141   29403 main.go:141] libmachine: () Calling .GetMachineName
	I0717 17:24:43.002323   29403 main.go:141] libmachine: (functional-142583) Calling .DriverName
	I0717 17:24:43.038381   29403 out.go:177] * Utilisation du pilote kvm2 basé sur le profil existant
	I0717 17:24:43.039813   29403 start.go:297] selected driver: kvm2
	I0717 17:24:43.039835   29403 start.go:901] validating driver "kvm2" against &{Name:functional-142583 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19264/minikube-v1.33.1-1721146474-19264-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1721146479-19264@sha256:7ee06b7e8fb4a6c7fce11a567253ea7d43fed61ee0beca281a1ac2c2566a2a2e Memory:4000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.30.2 ClusterName:functional-142583 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.50.115 Port:8441 KubernetesVersion:v1.30.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:2
6280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0717 17:24:43.039983   29403 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 17:24:43.042366   29403 out.go:177] 
	W0717 17:24:43.043640   29403 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0717 17:24:43.044999   29403 out.go:177] 
** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.15s)
=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd
=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:850: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 status
functional_test.go:856: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:868: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.89s)
=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect
=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1625: (dbg) Run:  kubectl --context functional-142583 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
functional_test.go:1631: (dbg) Run:  kubectl --context functional-142583 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:344: "hello-node-connect-57b4589c47-5999w" [8e97678f-387d-4b5f-92fb-4a8b4bc45bf4] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-connect-57b4589c47-5999w" [8e97678f-387d-4b5f-92fb-4a8b4bc45bf4] Running
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 8.004615235s
functional_test.go:1645: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 service hello-node-connect --url
functional_test.go:1651: found endpoint for hello-node-connect: http://192.168.50.115:30751
functional_test.go:1671: http://192.168.50.115:30751: success! body:
Hostname: hello-node-connect-57b4589c47-5999w
Pod Information:
	-no pod information available-
Server values:
	server_version=nginx: 1.13.3 - lua: 10008
Request Information:
	client_address=10.244.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.168.50.115:8080/
Request Headers:
	accept-encoding=gzip
	host=192.168.50.115:30751
	user-agent=Go-http-client/1.1

                                                
                                                
Request Body:
	-no body in request-

                                                
                                                
--- PASS: TestFunctional/parallel/ServiceCmdConnect (8.56s)

TestFunctional/parallel/AddonsCmd (0.12s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd
=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1686: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 addons list
functional_test.go:1698: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.12s)

TestFunctional/parallel/PersistentVolumeClaim (39.98s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:344: "storage-provisioner" [127c283f-ccb4-440b-969c-70b0e517b514] Running
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.005201006s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-142583 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-142583 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-142583 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-142583 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [63d109a7-0b4b-44bb-a27a-6ec1d5997cfd] Pending
helpers_test.go:344: "sp-pod" [63d109a7-0b4b-44bb-a27a-6ec1d5997cfd] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
2024/07/17 17:25:03 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
helpers_test.go:344: "sp-pod" [63d109a7-0b4b-44bb-a27a-6ec1d5997cfd] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 26.004288997s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-142583 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-142583 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:106: (dbg) Done: kubectl --context functional-142583 delete -f testdata/storage-provisioner/pod.yaml: (1.287314482s)
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-142583 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [f5808ec8-4cf0-4e6a-a806-7080f2eda203] Pending
helpers_test.go:344: "sp-pod" [f5808ec8-4cf0-4e6a-a806-7080f2eda203] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [f5808ec8-4cf0-4e6a-a806-7080f2eda203] Running
E0717 17:25:35.979726   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 7.004578694s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-142583 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (39.98s)

TestFunctional/parallel/SSHCmd (0.42s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1721: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "echo hello"
functional_test.go:1738: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.42s)

TestFunctional/parallel/CpCmd (1.31s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh -n functional-142583 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cp functional-142583:/home/docker/cp-test.txt /tmp/TestFunctionalparallelCpCmd557896621/001/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh -n functional-142583 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh -n functional-142583 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (1.31s)

TestFunctional/parallel/MySQL (34.02s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1789: (dbg) Run:  kubectl --context functional-142583 replace --force -f testdata/mysql.yaml
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:344: "mysql-64454c8b5c-lzb5g" [3581741b-6a8a-44ae-99aa-cf17c49b2026] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
helpers_test.go:344: "mysql-64454c8b5c-lzb5g" [3581741b-6a8a-44ae-99aa-cf17c49b2026] Running
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 25.004481468s
functional_test.go:1803: (dbg) Run:  kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;": exit status 1 (168.811377ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
	command terminated with exit code 1
** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;": exit status 1 (140.286557ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
	command terminated with exit code 1
** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;": exit status 1 (169.546354ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
	command terminated with exit code 1
** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;": exit status 1 (134.002199ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1
** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-142583 exec mysql-64454c8b5c-lzb5g -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (34.02s)

TestFunctional/parallel/FileSync (0.23s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1925: Checking for existence of /etc/test/nested/copy/21661/hosts within VM
functional_test.go:1927: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo cat /etc/test/nested/copy/21661/hosts"
functional_test.go:1932: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.23s)

TestFunctional/parallel/CertSync (1.52s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1968: Checking for existence of /etc/ssl/certs/21661.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo cat /etc/ssl/certs/21661.pem"
functional_test.go:1968: Checking for existence of /usr/share/ca-certificates/21661.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo cat /usr/share/ca-certificates/21661.pem"
functional_test.go:1968: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1969: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/216612.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo cat /etc/ssl/certs/216612.pem"
functional_test.go:1995: Checking for existence of /usr/share/ca-certificates/216612.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo cat /usr/share/ca-certificates/216612.pem"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1996: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (1.52s)

TestFunctional/parallel/NodeLabels (0.07s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:218: (dbg) Run:  kubectl --context functional-142583 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.07s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.56s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2023: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo systemctl is-active docker"
functional_test.go:2023: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 ssh "sudo systemctl is-active docker": exit status 1 (309.503626ms)

-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
functional_test.go:2023: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo systemctl is-active crio"
functional_test.go:2023: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 ssh "sudo systemctl is-active crio": exit status 1 (245.915393ms)

-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.56s)

TestFunctional/parallel/License (0.96s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License
=== CONT  TestFunctional/parallel/License
functional_test.go:2284: (dbg) Run:  out/minikube-linux-amd64 license
--- PASS: TestFunctional/parallel/License (0.96s)

TestFunctional/parallel/ServiceCmd/DeployApp (10.21s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1435: (dbg) Run:  kubectl --context functional-142583 create deployment hello-node --image=registry.k8s.io/echoserver:1.8
functional_test.go:1441: (dbg) Run:  kubectl --context functional-142583 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:344: "hello-node-6d85cfcfd8-2rdpl" [c8402f4f-3265-45cc-8adb-636ad73b8dd2] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-6d85cfcfd8-2rdpl" [c8402f4f-3265-45cc-8adb-636ad73b8dd2] Running
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 10.003941258s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (10.21s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.32s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1266: (dbg) Run:  out/minikube-linux-amd64 profile lis
functional_test.go:1271: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.32s)

TestFunctional/parallel/ProfileCmd/profile_list (0.29s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1306: (dbg) Run:  out/minikube-linux-amd64 profile list
functional_test.go:1311: Took "236.316038ms" to run "out/minikube-linux-amd64 profile list"
functional_test.go:1320: (dbg) Run:  out/minikube-linux-amd64 profile list -l
functional_test.go:1325: Took "55.297342ms" to run "out/minikube-linux-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.29s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.31s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1357: (dbg) Run:  out/minikube-linux-amd64 profile list -o json
functional_test.go:1362: Took "255.944823ms" to run "out/minikube-linux-amd64 profile list -o json"
functional_test.go:1370: (dbg) Run:  out/minikube-linux-amd64 profile list -o json --light
functional_test.go:1375: Took "49.64947ms" to run "out/minikube-linux-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.31s)

TestFunctional/parallel/MountCmd/any-port (9.6s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdany-port3962404065/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1721237082553111361" to /tmp/TestFunctionalparallelMountCmdany-port3962404065/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1721237082553111361" to /tmp/TestFunctionalparallelMountCmdany-port3962404065/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1721237082553111361" to /tmp/TestFunctionalparallelMountCmdany-port3962404065/001/test-1721237082553111361
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (217.885534ms)

** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Jul 17 17:24 created-by-test
-rw-r--r-- 1 docker docker 24 Jul 17 17:24 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Jul 17 17:24 test-1721237082553111361
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh cat /mount-9p/test-1721237082553111361
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-142583 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:344: "busybox-mount" [21ec070c-5f19-4c95-8ef3-f9e90a9223b5] Pending
helpers_test.go:344: "busybox-mount" [21ec070c-5f19-4c95-8ef3-f9e90a9223b5] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:344: "busybox-mount" [21ec070c-5f19-4c95-8ef3-f9e90a9223b5] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "busybox-mount" [21ec070c-5f19-4c95-8ef3-f9e90a9223b5] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 7.004470843s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-142583 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdany-port3962404065/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (9.60s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.09s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2115: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.09s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.09s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2115: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.09s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.09s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2115: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.09s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.25s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:260: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls --format short --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-linux-amd64 -p functional-142583 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.9
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.30.2
registry.k8s.io/kube-proxy:v1.30.2
registry.k8s.io/kube-controller-manager:v1.30.2
registry.k8s.io/kube-apiserver:v1.30.2
registry.k8s.io/etcd:3.5.12-0
registry.k8s.io/echoserver:1.8
registry.k8s.io/coredns/coredns:v1.11.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
docker.io/library/minikube-local-cache-test:functional-142583
docker.io/kindest/kindnetd:v20240513-cd2ac642
docker.io/kicbase/echo-server:functional-142583
functional_test.go:268: (dbg) Stderr: out/minikube-linux-amd64 -p functional-142583 image ls --format short --alsologtostderr:
I0717 17:25:05.046441   31331 out.go:291] Setting OutFile to fd 1 ...
I0717 17:25:05.046556   31331 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.046566   31331 out.go:304] Setting ErrFile to fd 2...
I0717 17:25:05.046573   31331 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.046889   31331 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
I0717 17:25:05.047615   31331 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.047755   31331 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.048306   31331 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.048356   31331 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.067298   31331 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34075
I0717 17:25:05.067736   31331 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.068295   31331 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.068320   31331 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.068720   31331 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.068909   31331 main.go:141] libmachine: (functional-142583) Calling .GetState
I0717 17:25:05.070848   31331 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.070899   31331 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.086261   31331 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46367
I0717 17:25:05.086708   31331 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.087266   31331 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.087305   31331 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.087730   31331 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.087905   31331 main.go:141] libmachine: (functional-142583) Calling .DriverName
I0717 17:25:05.088114   31331 ssh_runner.go:195] Run: systemctl --version
I0717 17:25:05.088147   31331 main.go:141] libmachine: (functional-142583) Calling .GetSSHHostname
I0717 17:25:05.090775   31331 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.091167   31331 main.go:141] libmachine: (functional-142583) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:ae:93:54", ip: ""} in network mk-functional-142583: {Iface:virbr1 ExpiryTime:2024-07-17 18:21:49 +0000 UTC Type:0 Mac:52:54:00:ae:93:54 Iaid: IPaddr:192.168.50.115 Prefix:24 Hostname:functional-142583 Clientid:01:52:54:00:ae:93:54}
I0717 17:25:05.091209   31331 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined IP address 192.168.50.115 and MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.091440   31331 main.go:141] libmachine: (functional-142583) Calling .GetSSHPort
I0717 17:25:05.091598   31331 main.go:141] libmachine: (functional-142583) Calling .GetSSHKeyPath
I0717 17:25:05.091743   31331 main.go:141] libmachine: (functional-142583) Calling .GetSSHUsername
I0717 17:25:05.091877   31331 sshutil.go:53] new ssh client: &{IP:192.168.50.115 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/functional-142583/id_rsa Username:docker}
I0717 17:25:05.189031   31331 ssh_runner.go:195] Run: sudo crictl images --output json
I0717 17:25:05.243404   31331 main.go:141] libmachine: Making call to close driver server
I0717 17:25:05.243419   31331 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:05.243671   31331 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:05.243697   31331 main.go:141] libmachine: Making call to close connection to plugin binary
I0717 17:25:05.243705   31331 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
I0717 17:25:05.243708   31331 main.go:141] libmachine: Making call to close driver server
I0717 17:25:05.243720   31331 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:05.243925   31331 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
I0717 17:25:05.243971   31331 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:05.243986   31331 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.25s)

TestFunctional/parallel/ImageCommands/ImageListTable (0.26s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:260: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls --format table --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-linux-amd64 -p functional-142583 image ls --format table --alsologtostderr:
|---------------------------------------------|--------------------|---------------|--------|
|                    Image                    |        Tag         |   Image ID    |  Size  |
|---------------------------------------------|--------------------|---------------|--------|
| registry.k8s.io/kube-controller-manager     | v1.30.2            | sha256:e87481 | 31.1MB |
| registry.k8s.io/kube-proxy                  | v1.30.2            | sha256:53c535 | 29MB   |
| registry.k8s.io/kube-scheduler              | v1.30.2            | sha256:7820c8 | 19.3MB |
| registry.k8s.io/pause                       | 3.3                | sha256:0184c1 | 298kB  |
| docker.io/kindest/kindnetd                  | v20240513-cd2ac642 | sha256:ac1c61 | 28.2MB |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc       | sha256:56cc51 | 2.4MB  |
| gcr.io/k8s-minikube/storage-provisioner     | v5                 | sha256:6e38f4 | 9.06MB |
| registry.k8s.io/coredns/coredns             | v1.11.1            | sha256:cbb01a | 18.2MB |
| registry.k8s.io/etcd                        | 3.5.12-0           | sha256:3861cf | 57.2MB |
| registry.k8s.io/pause                       | 3.9                | sha256:e6f181 | 322kB  |
| registry.k8s.io/pause                       | latest             | sha256:350b16 | 72.3kB |
| docker.io/library/minikube-local-cache-test | functional-142583  | sha256:adcd53 | 991B   |
| registry.k8s.io/kube-apiserver              | v1.30.2            | sha256:56ce0f | 32.8MB |
| docker.io/kicbase/echo-server               | functional-142583  | sha256:9056ab | 2.37MB |
| registry.k8s.io/echoserver                  | 1.8                | sha256:82e4c8 | 46.2MB |
| registry.k8s.io/pause                       | 3.1                | sha256:da86e6 | 315kB  |
|---------------------------------------------|--------------------|---------------|--------|
functional_test.go:268: (dbg) Stderr: out/minikube-linux-amd64 -p functional-142583 image ls --format table --alsologtostderr:
I0717 17:25:05.712410   31467 out.go:291] Setting OutFile to fd 1 ...
I0717 17:25:05.712529   31467 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.712540   31467 out.go:304] Setting ErrFile to fd 2...
I0717 17:25:05.712545   31467 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.712819   31467 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
I0717 17:25:05.713564   31467 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.713715   31467 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.714289   31467 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.714342   31467 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.729509   31467 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40903
I0717 17:25:05.730067   31467 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.730771   31467 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.730797   31467 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.731178   31467 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.731398   31467 main.go:141] libmachine: (functional-142583) Calling .GetState
I0717 17:25:05.733587   31467 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.733625   31467 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.748865   31467 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43589
I0717 17:25:05.749290   31467 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.749741   31467 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.749767   31467 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.750148   31467 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.750318   31467 main.go:141] libmachine: (functional-142583) Calling .DriverName
I0717 17:25:05.750533   31467 ssh_runner.go:195] Run: systemctl --version
I0717 17:25:05.750565   31467 main.go:141] libmachine: (functional-142583) Calling .GetSSHHostname
I0717 17:25:05.753194   31467 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.753560   31467 main.go:141] libmachine: (functional-142583) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:ae:93:54", ip: ""} in network mk-functional-142583: {Iface:virbr1 ExpiryTime:2024-07-17 18:21:49 +0000 UTC Type:0 Mac:52:54:00:ae:93:54 Iaid: IPaddr:192.168.50.115 Prefix:24 Hostname:functional-142583 Clientid:01:52:54:00:ae:93:54}
I0717 17:25:05.753592   31467 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined IP address 192.168.50.115 and MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.753710   31467 main.go:141] libmachine: (functional-142583) Calling .GetSSHPort
I0717 17:25:05.753898   31467 main.go:141] libmachine: (functional-142583) Calling .GetSSHKeyPath
I0717 17:25:05.754033   31467 main.go:141] libmachine: (functional-142583) Calling .GetSSHUsername
I0717 17:25:05.754199   31467 sshutil.go:53] new ssh client: &{IP:192.168.50.115 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/functional-142583/id_rsa Username:docker}
I0717 17:25:05.858661   31467 ssh_runner.go:195] Run: sudo crictl images --output json
I0717 17:25:05.911974   31467 main.go:141] libmachine: Making call to close driver server
I0717 17:25:05.911993   31467 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:05.912307   31467 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
I0717 17:25:05.912324   31467 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:05.912346   31467 main.go:141] libmachine: Making call to close connection to plugin binary
I0717 17:25:05.912365   31467 main.go:141] libmachine: Making call to close driver server
I0717 17:25:05.912379   31467 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:05.912610   31467 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:05.912625   31467 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
I0717 17:25:05.912626   31467 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.26s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.24s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:260: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls --format json --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-linux-amd64 -p functional-142583 image ls --format json --alsologtostderr:
[{"id":"sha256:56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e"],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"2395207"},{"id":"sha256:82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":["registry.k8s.io/echoserver@sha256:cb3386f863f6a4b05f33c191361723f9d5927ac287463b1bea633bf859475969"],"repoTags":["registry.k8s.io/echoserver:1.8"],"size":"46237695"},{"id":"sha256:ac1c61439df4625ba53a9ceaccb5eb07a830bdf942cc1c60535a4dd7e763d55f","repoDigests":["docker.io/kindest/kindnetd@sha256:9c2b5fcda3cb5a9725ecb893f3c8998a92d51a87465a886eb563e18d649383a8"],"repoTags":["docker.io/kindest/kindnetd:v20240513-cd2ac642"],"size":"28194900"},{"id":"sha256:9056ab77afb8e18e04303f11000a9d31b3f16b74c59475b899ae1b342d328d30","repoDigests":[],"repoTags":["docker.io/kicbase/echo-server:functional-142583"],"size":"2372971"},{"id":"sha256:adcd53ac037166c778f754176362bc301ff4392ef07a87909888a864f2fb230f","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-142583"],"size":"991"},{"id":"sha256:7820c83aa139453522e9028341d0d4f23ca2721ec80c7a47425446d11157b940","repoDigests":["registry.k8s.io/kube-scheduler@sha256:0ed75a333704f5d315395c6ec04d7af7405715537069b65d40b43ec1c8e030bc"],"repoTags":["registry.k8s.io/kube-scheduler:v1.30.2"],"size":"19328121"},{"id":"sha256:0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"297686"},{"id":"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c","repoDigests":["registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097"],"repoTags":["registry.k8s.io/pause:3.9"],"size":"321520"},{"id":"sha256:115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7","repoDigests":["docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c"],"repoTags":[],"size":"19746404"},{"id":"sha256:53c535741fb446f6b34d720fdc5748db368ef96771111f3892682e6eab8f3772","repoDigests":["registry.k8s.io/kube-proxy@sha256:8a44c6e094af3dea3de57fa967e201608a358a3bd8b4e3f31ab905bbe4108aec"],"repoTags":["registry.k8s.io/kube-proxy:v1.30.2"],"size":"29034457"},{"id":"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899","repoDigests":["registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b"],"repoTags":["registry.k8s.io/etcd:3.5.12-0"],"size":"57236178"},{"id":"sha256:56ce0fd9fb532bcb552ddbdbe3064189ce823a71693d97ff7a0a7a4ff6bffbbe","repoDigests":["registry.k8s.io/kube-apiserver@sha256:340ab4a1d66a60630a7a298aa0b2576fcd82e51ecdddb751cf61e5d3846fde2d"],"repoTags":["registry.k8s.io/kube-apiserver:v1.30.2"],"size":"32768601"},{"id":"sha256:e874818b3caac34f68704eb96bf248d0c8116b1262ab549d45d39dd3dd775974","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:4c412bc1fc585ddeba10d34a02e7507ea787ec2c57256d4c18fd230377ab048e"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.30.2"],"size":"31138657"},{"id":"sha256:da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"315399"},{"id":"sha256:350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"72306"},{"id":"sha256:07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558","repoDigests":["docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93"],"repoTags":[],"size":"75788960"},{"id":"sha256:6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"9058936"},{"id":"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4","repoDigests":["registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1"],"repoTags":["registry.k8s.io/coredns/coredns:v1.11.1"],"size":"18182961"}]
functional_test.go:268: (dbg) Stderr: out/minikube-linux-amd64 -p functional-142583 image ls --format json --alsologtostderr:
I0717 17:25:05.463812   31421 out.go:291] Setting OutFile to fd 1 ...
I0717 17:25:05.463904   31421 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.463911   31421 out.go:304] Setting ErrFile to fd 2...
I0717 17:25:05.463915   31421 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.464126   31421 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
I0717 17:25:05.464633   31421 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.464727   31421 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.465094   31421 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.465132   31421 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.481080   31421 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34751
I0717 17:25:05.481477   31421 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.482079   31421 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.482108   31421 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.482430   31421 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.482645   31421 main.go:141] libmachine: (functional-142583) Calling .GetState
I0717 17:25:05.484803   31421 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.484851   31421 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.501374   31421 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37169
I0717 17:25:05.501838   31421 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.502380   31421 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.502405   31421 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.502666   31421 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.502854   31421 main.go:141] libmachine: (functional-142583) Calling .DriverName
I0717 17:25:05.503055   31421 ssh_runner.go:195] Run: systemctl --version
I0717 17:25:05.503096   31421 main.go:141] libmachine: (functional-142583) Calling .GetSSHHostname
I0717 17:25:05.506132   31421 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.506579   31421 main.go:141] libmachine: (functional-142583) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:ae:93:54", ip: ""} in network mk-functional-142583: {Iface:virbr1 ExpiryTime:2024-07-17 18:21:49 +0000 UTC Type:0 Mac:52:54:00:ae:93:54 Iaid: IPaddr:192.168.50.115 Prefix:24 Hostname:functional-142583 Clientid:01:52:54:00:ae:93:54}
I0717 17:25:05.506621   31421 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined IP address 192.168.50.115 and MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.506782   31421 main.go:141] libmachine: (functional-142583) Calling .GetSSHPort
I0717 17:25:05.506933   31421 main.go:141] libmachine: (functional-142583) Calling .GetSSHKeyPath
I0717 17:25:05.507086   31421 main.go:141] libmachine: (functional-142583) Calling .GetSSHUsername
I0717 17:25:05.507218   31421 sshutil.go:53] new ssh client: &{IP:192.168.50.115 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/functional-142583/id_rsa Username:docker}
I0717 17:25:05.600995   31421 ssh_runner.go:195] Run: sudo crictl images --output json
I0717 17:25:05.659222   31421 main.go:141] libmachine: Making call to close driver server
I0717 17:25:05.659243   31421 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:05.659501   31421 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:05.659524   31421 main.go:141] libmachine: Making call to close connection to plugin binary
I0717 17:25:05.659528   31421 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
I0717 17:25:05.659533   31421 main.go:141] libmachine: Making call to close driver server
I0717 17:25:05.659544   31421 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:05.659833   31421 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:05.659847   31421 main.go:141] libmachine: Making call to close connection to plugin binary
I0717 17:25:05.659847   31421 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.24s)
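As context for reading the `image ls --format json` output above: it is a flat JSON array of image records with `id`, `repoDigests`, `repoTags`, and `size` fields, where `size` is a string of bytes rather than a number. A minimal sketch of summarizing that shape with Python's standard library (the two sample records below are copied from the listing above; this is an illustration, not part of the test):

```python
import json

# Abridged sample in the same shape as the `image ls --format json` stdout above.
raw = """[
  {"id": "sha256:6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562",
   "repoDigests": ["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],
   "repoTags": ["gcr.io/k8s-minikube/storage-provisioner:v5"],
   "size": "9058936"},
  {"id": "sha256:350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06",
   "repoDigests": [],
   "repoTags": ["registry.k8s.io/pause:latest"],
   "size": "72306"}
]"""

images = json.loads(raw)
# "size" values are JSON strings, so convert to int before summing.
total_bytes = sum(int(img["size"]) for img in images)
tags = [tag for img in images for tag in img["repoTags"]]
print(total_bytes)
print(tags)
```

The same parsing applies to the full stdout captured by the test; `repoDigests` can be empty (e.g. locally built images), so code consuming it should not assume at least one entry.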

TestFunctional/parallel/ImageCommands/ImageListYaml (0.29s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:260: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls --format yaml --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-linux-amd64 -p functional-142583 image ls --format yaml --alsologtostderr:
- id: sha256:56ce0fd9fb532bcb552ddbdbe3064189ce823a71693d97ff7a0a7a4ff6bffbbe
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:340ab4a1d66a60630a7a298aa0b2576fcd82e51ecdddb751cf61e5d3846fde2d
repoTags:
- registry.k8s.io/kube-apiserver:v1.30.2
size: "32768601"
- id: sha256:9056ab77afb8e18e04303f11000a9d31b3f16b74c59475b899ae1b342d328d30
repoDigests: []
repoTags:
- docker.io/kicbase/echo-server:functional-142583
size: "2372971"
- id: sha256:adcd53ac037166c778f754176362bc301ff4392ef07a87909888a864f2fb230f
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-142583
size: "991"
- id: sha256:82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests:
- registry.k8s.io/echoserver@sha256:cb3386f863f6a4b05f33c191361723f9d5927ac287463b1bea633bf859475969
repoTags:
- registry.k8s.io/echoserver:1.8
size: "46237695"
- id: sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899
repoDigests:
- registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b
repoTags:
- registry.k8s.io/etcd:3.5.12-0
size: "57236178"
- id: sha256:0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "297686"
- id: sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c
repoDigests:
- registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097
repoTags:
- registry.k8s.io/pause:3.9
size: "321520"
- id: sha256:ac1c61439df4625ba53a9ceaccb5eb07a830bdf942cc1c60535a4dd7e763d55f
repoDigests:
- docker.io/kindest/kindnetd@sha256:9c2b5fcda3cb5a9725ecb893f3c8998a92d51a87465a886eb563e18d649383a8
repoTags:
- docker.io/kindest/kindnetd:v20240513-cd2ac642
size: "28194900"
- id: sha256:07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558
repoDigests:
- docker.io/kubernetesui/dashboard@sha256:2e500d29e9d5f4a086b908eb8dfe7ecac57d2ab09d65b24f588b1d449841ef93
repoTags: []
size: "75788960"
- id: sha256:115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7
repoDigests:
- docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c
repoTags: []
size: "19746404"
- id: sha256:56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests:
- gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "2395207"
- id: sha256:7820c83aa139453522e9028341d0d4f23ca2721ec80c7a47425446d11157b940
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:0ed75a333704f5d315395c6ec04d7af7405715537069b65d40b43ec1c8e030bc
repoTags:
- registry.k8s.io/kube-scheduler:v1.30.2
size: "19328121"
- id: sha256:da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "315399"
- id: sha256:350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "72306"
- id: sha256:6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "9058936"
- id: sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1
repoTags:
- registry.k8s.io/coredns/coredns:v1.11.1
size: "18182961"
- id: sha256:e874818b3caac34f68704eb96bf248d0c8116b1262ab549d45d39dd3dd775974
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:4c412bc1fc585ddeba10d34a02e7507ea787ec2c57256d4c18fd230377ab048e
repoTags:
- registry.k8s.io/kube-controller-manager:v1.30.2
size: "31138657"
- id: sha256:53c535741fb446f6b34d720fdc5748db368ef96771111f3892682e6eab8f3772
repoDigests:
- registry.k8s.io/kube-proxy@sha256:8a44c6e094af3dea3de57fa967e201608a358a3bd8b4e3f31ab905bbe4108aec
repoTags:
- registry.k8s.io/kube-proxy:v1.30.2
size: "29034457"

functional_test.go:268: (dbg) Stderr: out/minikube-linux-amd64 -p functional-142583 image ls --format yaml --alsologtostderr:
I0717 17:25:05.175348   31366 out.go:291] Setting OutFile to fd 1 ...
I0717 17:25:05.175477   31366 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.175487   31366 out.go:304] Setting ErrFile to fd 2...
I0717 17:25:05.175494   31366 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.175683   31366 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
I0717 17:25:05.176224   31366 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.176338   31366 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.176707   31366 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.176752   31366 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.191605   31366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33587
I0717 17:25:05.192073   31366 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.192690   31366 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.192712   31366 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.193072   31366 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.193286   31366 main.go:141] libmachine: (functional-142583) Calling .GetState
I0717 17:25:05.195204   31366 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.195259   31366 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.210325   31366 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39923
I0717 17:25:05.210710   31366 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.211267   31366 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.211296   31366 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.211609   31366 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.211815   31366 main.go:141] libmachine: (functional-142583) Calling .DriverName
I0717 17:25:05.212023   31366 ssh_runner.go:195] Run: systemctl --version
I0717 17:25:05.212058   31366 main.go:141] libmachine: (functional-142583) Calling .GetSSHHostname
I0717 17:25:05.215000   31366 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.215430   31366 main.go:141] libmachine: (functional-142583) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:ae:93:54", ip: ""} in network mk-functional-142583: {Iface:virbr1 ExpiryTime:2024-07-17 18:21:49 +0000 UTC Type:0 Mac:52:54:00:ae:93:54 Iaid: IPaddr:192.168.50.115 Prefix:24 Hostname:functional-142583 Clientid:01:52:54:00:ae:93:54}
I0717 17:25:05.215462   31366 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined IP address 192.168.50.115 and MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.215593   31366 main.go:141] libmachine: (functional-142583) Calling .GetSSHPort
I0717 17:25:05.215768   31366 main.go:141] libmachine: (functional-142583) Calling .GetSSHKeyPath
I0717 17:25:05.215895   31366 main.go:141] libmachine: (functional-142583) Calling .GetSSHUsername
I0717 17:25:05.216020   31366 sshutil.go:53] new ssh client: &{IP:192.168.50.115 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/functional-142583/id_rsa Username:docker}
I0717 17:25:05.317876   31366 ssh_runner.go:195] Run: sudo crictl images --output json
I0717 17:25:05.417394   31366 main.go:141] libmachine: Making call to close driver server
I0717 17:25:05.417411   31366 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:05.417683   31366 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:05.417706   31366 main.go:141] libmachine: Making call to close connection to plugin binary
I0717 17:25:05.417716   31366 main.go:141] libmachine: Making call to close driver server
I0717 17:25:05.417723   31366 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:05.417723   31366 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
I0717 17:25:05.417994   31366 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:05.418048   31366 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.29s)

TestFunctional/parallel/ImageCommands/ImageBuild (4.68s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:307: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh pgrep buildkitd
functional_test.go:307: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 ssh pgrep buildkitd: exit status 1 (218.05056ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:314: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image build -t localhost/my-image:functional-142583 testdata/build --alsologtostderr
functional_test.go:314: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 image build -t localhost/my-image:functional-142583 testdata/build --alsologtostderr: (4.246658765s)
functional_test.go:322: (dbg) Stderr: out/minikube-linux-amd64 -p functional-142583 image build -t localhost/my-image:functional-142583 testdata/build --alsologtostderr:
I0717 17:25:05.513110   31433 out.go:291] Setting OutFile to fd 1 ...
I0717 17:25:05.513265   31433 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.513275   31433 out.go:304] Setting ErrFile to fd 2...
I0717 17:25:05.513280   31433 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0717 17:25:05.513476   31433 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
I0717 17:25:05.514046   31433 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.514667   31433 config.go:182] Loaded profile config "functional-142583": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
I0717 17:25:05.515226   31433 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.515275   31433 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.530743   31433 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38395
I0717 17:25:05.531175   31433 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.531726   31433 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.531767   31433 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.532183   31433 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.532376   31433 main.go:141] libmachine: (functional-142583) Calling .GetState
I0717 17:25:05.534138   31433 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0717 17:25:05.534178   31433 main.go:141] libmachine: Launching plugin server for driver kvm2
I0717 17:25:05.548198   31433 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38711
I0717 17:25:05.548514   31433 main.go:141] libmachine: () Calling .GetVersion
I0717 17:25:05.548950   31433 main.go:141] libmachine: Using API Version  1
I0717 17:25:05.548970   31433 main.go:141] libmachine: () Calling .SetConfigRaw
I0717 17:25:05.549272   31433 main.go:141] libmachine: () Calling .GetMachineName
I0717 17:25:05.549444   31433 main.go:141] libmachine: (functional-142583) Calling .DriverName
I0717 17:25:05.549635   31433 ssh_runner.go:195] Run: systemctl --version
I0717 17:25:05.549667   31433 main.go:141] libmachine: (functional-142583) Calling .GetSSHHostname
I0717 17:25:05.552420   31433 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.552834   31433 main.go:141] libmachine: (functional-142583) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:ae:93:54", ip: ""} in network mk-functional-142583: {Iface:virbr1 ExpiryTime:2024-07-17 18:21:49 +0000 UTC Type:0 Mac:52:54:00:ae:93:54 Iaid: IPaddr:192.168.50.115 Prefix:24 Hostname:functional-142583 Clientid:01:52:54:00:ae:93:54}
I0717 17:25:05.552857   31433 main.go:141] libmachine: (functional-142583) DBG | domain functional-142583 has defined IP address 192.168.50.115 and MAC address 52:54:00:ae:93:54 in network mk-functional-142583
I0717 17:25:05.553135   31433 main.go:141] libmachine: (functional-142583) Calling .GetSSHPort
I0717 17:25:05.553342   31433 main.go:141] libmachine: (functional-142583) Calling .GetSSHKeyPath
I0717 17:25:05.553514   31433 main.go:141] libmachine: (functional-142583) Calling .GetSSHUsername
I0717 17:25:05.553645   31433 sshutil.go:53] new ssh client: &{IP:192.168.50.115 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/functional-142583/id_rsa Username:docker}
I0717 17:25:05.650324   31433 build_images.go:161] Building image from path: /tmp/build.2119363164.tar
I0717 17:25:05.650383   31433 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I0717 17:25:05.672029   31433 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.2119363164.tar
I0717 17:25:05.677595   31433 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.2119363164.tar: stat -c "%s %y" /var/lib/minikube/build/build.2119363164.tar: Process exited with status 1
stdout:
stderr:
stat: cannot statx '/var/lib/minikube/build/build.2119363164.tar': No such file or directory
I0717 17:25:05.677637   31433 ssh_runner.go:362] scp /tmp/build.2119363164.tar --> /var/lib/minikube/build/build.2119363164.tar (3072 bytes)
I0717 17:25:05.715007   31433 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.2119363164
I0717 17:25:05.728285   31433 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.2119363164 -xf /var/lib/minikube/build/build.2119363164.tar
I0717 17:25:05.740475   31433 containerd.go:394] Building image: /var/lib/minikube/build/build.2119363164
I0717 17:25:05.740537   31433 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.2119363164 --local dockerfile=/var/lib/minikube/build/build.2119363164 --output type=image,name=localhost/my-image:functional-142583
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.0s
#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 1.5s
#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s
#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s
#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.0s done
#5 DONE 0.1s
#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 0B / 772.79kB 0.2s
#5 sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 772.79kB / 772.79kB 0.8s
#5 sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 772.79kB / 772.79kB 0.9s done
#5 extracting sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 0.1s done
#5 DONE 1.1s
#6 [2/3] RUN true
#6 DONE 0.5s
#7 [3/3] ADD content.txt /
#7 DONE 0.1s
#8 exporting to image
#8 exporting layers
#8 exporting layers 0.2s done
#8 exporting manifest sha256:e097a72720c48963adb69a5e4062bd58c668ebcba6ceaf1789008e88e6f0b49d
#8 exporting manifest sha256:e097a72720c48963adb69a5e4062bd58c668ebcba6ceaf1789008e88e6f0b49d 0.0s done
#8 exporting config sha256:dcc395097395f94c14b8392648f767c9eaa26394a25cfd10e8c4ea538a8679a7 0.0s done
#8 naming to localhost/my-image:functional-142583 done
#8 DONE 0.2s
I0717 17:25:09.675554   31433 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.2119363164 --local dockerfile=/var/lib/minikube/build/build.2119363164 --output type=image,name=localhost/my-image:functional-142583: (3.934987812s)
I0717 17:25:09.675634   31433 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.2119363164
I0717 17:25:09.690501   31433 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.2119363164.tar
I0717 17:25:09.708439   31433 build_images.go:217] Built localhost/my-image:functional-142583 from /tmp/build.2119363164.tar
I0717 17:25:09.708467   31433 build_images.go:133] succeeded building to: functional-142583
I0717 17:25:09.708472   31433 build_images.go:134] failed building to: 
I0717 17:25:09.708497   31433 main.go:141] libmachine: Making call to close driver server
I0717 17:25:09.708513   31433 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:09.708759   31433 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:09.708774   31433 main.go:141] libmachine: Making call to close connection to plugin binary
I0717 17:25:09.708782   31433 main.go:141] libmachine: Making call to close driver server
I0717 17:25:09.708791   31433 main.go:141] libmachine: (functional-142583) Calling .Close
I0717 17:25:09.708792   31433 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
I0717 17:25:09.709080   31433 main.go:141] libmachine: (functional-142583) DBG | Closing plugin on server side
I0717 17:25:09.709123   31433 main.go:141] libmachine: Successfully made call to close driver server
I0717 17:25:09.709172   31433 main.go:141] libmachine: Making call to close connection to plugin binary
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (4.68s)

TestFunctional/parallel/ImageCommands/Setup (2.75s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:341: (dbg) Run:  docker pull docker.io/kicbase/echo-server:1.0
functional_test.go:341: (dbg) Done: docker pull docker.io/kicbase/echo-server:1.0: (2.734292176s)
functional_test.go:346: (dbg) Run:  docker tag docker.io/kicbase/echo-server:1.0 docker.io/kicbase/echo-server:functional-142583
--- PASS: TestFunctional/parallel/ImageCommands/Setup (2.75s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.73s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:354: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image load --daemon docker.io/kicbase/echo-server:functional-142583 --alsologtostderr
functional_test.go:354: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 image load --daemon docker.io/kicbase/echo-server:functional-142583 --alsologtostderr: (1.522931897s)
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (1.73s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.06s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:364: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image load --daemon docker.io/kicbase/echo-server:functional-142583 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (1.06s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (2.62s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:234: (dbg) Run:  docker pull docker.io/kicbase/echo-server:latest
functional_test.go:234: (dbg) Done: docker pull docker.io/kicbase/echo-server:latest: (1.180837845s)
functional_test.go:239: (dbg) Run:  docker tag docker.io/kicbase/echo-server:latest docker.io/kicbase/echo-server:functional-142583
functional_test.go:244: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image load --daemon docker.io/kicbase/echo-server:functional-142583 --alsologtostderr
functional_test.go:244: (dbg) Done: out/minikube-linux-amd64 -p functional-142583 image load --daemon docker.io/kicbase/echo-server:functional-142583 --alsologtostderr: (1.036277683s)
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (2.62s)

TestFunctional/parallel/ServiceCmd/List (0.45s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1455: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.45s)

TestFunctional/parallel/MountCmd/specific-port (1.68s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdspecific-port3178525559/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (238.344129ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdspecific-port3178525559/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 ssh "sudo umount -f /mount-9p": exit status 1 (276.535925ms)
-- stdout --
	umount: /mount-9p: not mounted.
-- /stdout --
** stderr ** 
	ssh: Process exited with status 32
** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-amd64 -p functional-142583 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdspecific-port3178525559/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.68s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.49s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1485: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 service list -o json
functional_test.go:1490: Took "492.426522ms" to run "out/minikube-linux-amd64 -p functional-142583 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.49s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.39s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1505: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 service --namespace=default --https --url hello-node
functional_test.go:1518: found endpoint: https://192.168.50.115:31833
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.39s)

TestFunctional/parallel/ServiceCmd/Format (0.3s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1536: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.30s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.99s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:379: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image save docker.io/kicbase/echo-server:functional-142583 /home/jenkins/workspace/KVM_Linux_containerd_integration/echo-server-save.tar --alsologtostderr
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (0.99s)

TestFunctional/parallel/ServiceCmd/URL (0.46s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1555: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 service hello-node --url
functional_test.go:1561: found endpoint for hello-node: http://192.168.50.115:31833
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.46s)

TestFunctional/parallel/MountCmd/VerifyCleanup (1.68s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1875143938/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1875143938/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1875143938/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T" /mount1: exit status 1 (324.20299ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-amd64 mount -p functional-142583 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1875143938/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1875143938/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-142583 /tmp/TestFunctionalparallelMountCmdVerifyCleanup1875143938/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (1.68s)

TestFunctional/parallel/Version/short (0.05s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2252: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 version --short
--- PASS: TestFunctional/parallel/Version/short (0.05s)

TestFunctional/parallel/Version/components (0.64s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2266: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.64s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.51s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:391: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image rm docker.io/kicbase/echo-server:functional-142583 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.51s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.01s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:408: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image load /home/jenkins/workspace/KVM_Linux_containerd_integration/echo-server-save.tar --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.01s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.68s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:418: (dbg) Run:  docker rmi docker.io/kicbase/echo-server:functional-142583
functional_test.go:423: (dbg) Run:  out/minikube-linux-amd64 -p functional-142583 image save --daemon docker.io/kicbase/echo-server:functional-142583 --alsologtostderr
functional_test.go:428: (dbg) Run:  docker image inspect docker.io/kicbase/echo-server:functional-142583
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (0.68s)

TestFunctional/delete_echo-server_images (0.03s)

=== RUN   TestFunctional/delete_echo-server_images
functional_test.go:189: (dbg) Run:  docker rmi -f docker.io/kicbase/echo-server:1.0
functional_test.go:189: (dbg) Run:  docker rmi -f docker.io/kicbase/echo-server:functional-142583
--- PASS: TestFunctional/delete_echo-server_images (0.03s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:197: (dbg) Run:  docker rmi -f localhost/my-image:functional-142583
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.01s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:205: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-142583
--- PASS: TestFunctional/delete_minikube_cached_images (0.01s)

TestMultiControlPlane/serial/NodeLabels (0.06s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-333994 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.06s)

TestJSONOutput/start/Command (96.7s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-amd64 start -p json-output-140730 --output=json --user=testUser --memory=2200 --wait=true --driver=kvm2  --container-runtime=containerd
json_output_test.go:63: (dbg) Done: out/minikube-linux-amd64 start -p json-output-140730 --output=json --user=testUser --memory=2200 --wait=true --driver=kvm2  --container-runtime=containerd: (1m36.699760169s)
--- PASS: TestJSONOutput/start/Command (96.70s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.7s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-amd64 pause -p json-output-140730 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.70s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.63s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-amd64 unpause -p json-output-140730 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.63s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (6.5s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-amd64 stop -p json-output-140730 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-linux-amd64 stop -p json-output-140730 --output=json --user=testUser: (6.498234276s)
--- PASS: TestJSONOutput/stop/Command (6.50s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.18s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-linux-amd64 start -p json-output-error-845639 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p json-output-error-845639 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (57.950597ms)

-- stdout --
	{"specversion":"1.0","id":"40509caf-850b-4bad-936f-3700ae231e0c","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-845639] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"cebba043-0d5f-4725-a501-f31797ab8685","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=19283"}}
	{"specversion":"1.0","id":"aef10b35-6431-4b4d-b6c3-acf1514e3e17","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"fb650ac7-2c9f-46b3-9f4a-1bd1c32df02e","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig"}}
	{"specversion":"1.0","id":"275f8272-7ac1-4093-8a52-68af4f2fbdab","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube"}}
	{"specversion":"1.0","id":"f4dc218f-9dbf-4c60-9afa-7e392c3d3f3a","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-amd64"}}
	{"specversion":"1.0","id":"0e657fa3-abe5-456b-a28a-9e3570f643ef","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"c475c23d-ece4-40a7-b4ff-022d97186d48","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on linux/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-845639" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p json-output-error-845639
--- PASS: TestErrorJSONOutput (0.18s)

TestMainNoArgs (0.04s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-linux-amd64
--- PASS: TestMainNoArgs (0.04s)

TestMinikubeProfile (91.81s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-amd64 start -p first-912211 --driver=kvm2  --container-runtime=containerd
E0717 17:57:52.134196   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-amd64 start -p first-912211 --driver=kvm2  --container-runtime=containerd: (40.839291681s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-amd64 start -p second-914483 --driver=kvm2  --container-runtime=containerd
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-amd64 start -p second-914483 --driver=kvm2  --container-runtime=containerd: (48.407485176s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-amd64 profile first-912211
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-amd64 profile second-914483
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-914483" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p second-914483
helpers_test.go:175: Cleaning up "first-912211" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p first-912211
--- PASS: TestMinikubeProfile (91.81s)

TestMountStart/serial/StartWithMountFirst (27.8s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-linux-amd64 start -p mount-start-1-344414 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=kvm2  --container-runtime=containerd
mount_start_test.go:98: (dbg) Done: out/minikube-linux-amd64 start -p mount-start-1-344414 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=kvm2  --container-runtime=containerd: (26.801810057s)
--- PASS: TestMountStart/serial/StartWithMountFirst (27.80s)

TestMountStart/serial/VerifyMountFirst (0.37s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-1-344414 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-1-344414 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.37s)

TestMountStart/serial/StartWithMountSecond (32.04s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-linux-amd64 start -p mount-start-2-361291 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=kvm2  --container-runtime=containerd
E0717 17:59:41.797097   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
mount_start_test.go:98: (dbg) Done: out/minikube-linux-amd64 start -p mount-start-2-361291 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=kvm2  --container-runtime=containerd: (31.035140962s)
--- PASS: TestMountStart/serial/StartWithMountSecond (32.04s)

TestMountStart/serial/VerifyMountSecond (0.35s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-361291 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-361291 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.35s)

TestMountStart/serial/DeleteFirst (0.68s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-linux-amd64 delete -p mount-start-1-344414 --alsologtostderr -v=5
--- PASS: TestMountStart/serial/DeleteFirst (0.68s)

TestMountStart/serial/VerifyMountPostDelete (0.35s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-361291 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-361291 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.35s)

TestMountStart/serial/Stop (1.58s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-linux-amd64 stop -p mount-start-2-361291
mount_start_test.go:155: (dbg) Done: out/minikube-linux-amd64 stop -p mount-start-2-361291: (1.583755084s)
--- PASS: TestMountStart/serial/Stop (1.58s)

TestMountStart/serial/RestartStopped (22.57s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-linux-amd64 start -p mount-start-2-361291
mount_start_test.go:166: (dbg) Done: out/minikube-linux-amd64 start -p mount-start-2-361291: (21.56938675s)
--- PASS: TestMountStart/serial/RestartStopped (22.57s)

TestMountStart/serial/VerifyMountPostStop (0.36s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-361291 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-361291 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.36s)

TestMultiNode/serial/FreshStart2Nodes (124.42s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-832048 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=kvm2  --container-runtime=containerd
multinode_test.go:96: (dbg) Done: out/minikube-linux-amd64 start -p multinode-832048 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=kvm2  --container-runtime=containerd: (2m4.035676536s)
multinode_test.go:102: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (124.42s)

TestMultiNode/serial/DeployApp2Nodes (5.82s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
E0717 18:02:44.846968   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
multinode_test.go:498: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-linux-amd64 kubectl -p multinode-832048 -- rollout status deployment/busybox: (4.278465171s)
multinode_test.go:505: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-8tgwm -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-k2gpj -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-8tgwm -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-k2gpj -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-8tgwm -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-k2gpj -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (5.82s)

TestMultiNode/serial/PingHostFrom2Pods (0.75s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-8tgwm -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-8tgwm -- sh -c "ping -c 1 192.168.39.1"
multinode_test.go:572: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-k2gpj -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-832048 -- exec busybox-fc5497c4f-k2gpj -- sh -c "ping -c 1 192.168.39.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.75s)

TestMultiNode/serial/AddNode (52.29s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-linux-amd64 node add -p multinode-832048 -v 3 --alsologtostderr
E0717 18:02:52.133982   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
multinode_test.go:121: (dbg) Done: out/minikube-linux-amd64 node add -p multinode-832048 -v 3 --alsologtostderr: (51.742575402s)
multinode_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (52.29s)

TestMultiNode/serial/MultiNodeLabels (0.06s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-832048 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.06s)

TestMultiNode/serial/ProfileList (0.2s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.20s)

TestMultiNode/serial/CopyFile (6.83s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status --output json --alsologtostderr
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp testdata/cp-test.txt multinode-832048:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile949945905/001/cp-test_multinode-832048.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048:/home/docker/cp-test.txt multinode-832048-m02:/home/docker/cp-test_multinode-832048_multinode-832048-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m02 "sudo cat /home/docker/cp-test_multinode-832048_multinode-832048-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048:/home/docker/cp-test.txt multinode-832048-m03:/home/docker/cp-test_multinode-832048_multinode-832048-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m03 "sudo cat /home/docker/cp-test_multinode-832048_multinode-832048-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp testdata/cp-test.txt multinode-832048-m02:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048-m02:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile949945905/001/cp-test_multinode-832048-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048-m02:/home/docker/cp-test.txt multinode-832048:/home/docker/cp-test_multinode-832048-m02_multinode-832048.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048 "sudo cat /home/docker/cp-test_multinode-832048-m02_multinode-832048.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048-m02:/home/docker/cp-test.txt multinode-832048-m03:/home/docker/cp-test_multinode-832048-m02_multinode-832048-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m03 "sudo cat /home/docker/cp-test_multinode-832048-m02_multinode-832048-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp testdata/cp-test.txt multinode-832048-m03:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048-m03:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile949945905/001/cp-test_multinode-832048-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048-m03:/home/docker/cp-test.txt multinode-832048:/home/docker/cp-test_multinode-832048-m03_multinode-832048.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048 "sudo cat /home/docker/cp-test_multinode-832048-m03_multinode-832048.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 cp multinode-832048-m03:/home/docker/cp-test.txt multinode-832048-m02:/home/docker/cp-test_multinode-832048-m03_multinode-832048-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 ssh -n multinode-832048-m02 "sudo cat /home/docker/cp-test_multinode-832048-m03_multinode-832048-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (6.83s)

TestMultiNode/serial/StopNode (2.13s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-linux-amd64 -p multinode-832048 node stop m03: (1.298298454s)
multinode_test.go:254: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-linux-amd64 -p multinode-832048 status: exit status 7 (413.952319ms)

-- stdout --
	multinode-832048
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-832048-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-832048-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-linux-amd64 -p multinode-832048 status --alsologtostderr: exit status 7 (412.363437ms)

-- stdout --
	multinode-832048
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-832048-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-832048-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0717 18:03:52.449702   47955 out.go:291] Setting OutFile to fd 1 ...
	I0717 18:03:52.449823   47955 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 18:03:52.449832   47955 out.go:304] Setting ErrFile to fd 2...
	I0717 18:03:52.449839   47955 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 18:03:52.450028   47955 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 18:03:52.450236   47955 out.go:298] Setting JSON to false
	I0717 18:03:52.450265   47955 mustload.go:65] Loading cluster: multinode-832048
	I0717 18:03:52.450364   47955 notify.go:220] Checking for updates...
	I0717 18:03:52.450646   47955 config.go:182] Loaded profile config "multinode-832048": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 18:03:52.450669   47955 status.go:255] checking status of multinode-832048 ...
	I0717 18:03:52.451077   47955 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:03:52.451120   47955 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:03:52.470427   47955 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37307
	I0717 18:03:52.470888   47955 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:03:52.471480   47955 main.go:141] libmachine: Using API Version  1
	I0717 18:03:52.471512   47955 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:03:52.471907   47955 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:03:52.472094   47955 main.go:141] libmachine: (multinode-832048) Calling .GetState
	I0717 18:03:52.473658   47955 status.go:330] multinode-832048 host status = "Running" (err=<nil>)
	I0717 18:03:52.473674   47955 host.go:66] Checking if "multinode-832048" exists ...
	I0717 18:03:52.473944   47955 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:03:52.473980   47955 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:03:52.489664   47955 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41535
	I0717 18:03:52.490264   47955 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:03:52.490726   47955 main.go:141] libmachine: Using API Version  1
	I0717 18:03:52.490747   47955 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:03:52.491058   47955 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:03:52.491251   47955 main.go:141] libmachine: (multinode-832048) Calling .GetIP
	I0717 18:03:52.493981   47955 main.go:141] libmachine: (multinode-832048) DBG | domain multinode-832048 has defined MAC address 52:54:00:43:c9:d0 in network mk-multinode-832048
	I0717 18:03:52.494428   47955 main.go:141] libmachine: (multinode-832048) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:43:c9:d0", ip: ""} in network mk-multinode-832048: {Iface:virbr1 ExpiryTime:2024-07-17 19:00:54 +0000 UTC Type:0 Mac:52:54:00:43:c9:d0 Iaid: IPaddr:192.168.39.158 Prefix:24 Hostname:multinode-832048 Clientid:01:52:54:00:43:c9:d0}
	I0717 18:03:52.494459   47955 main.go:141] libmachine: (multinode-832048) DBG | domain multinode-832048 has defined IP address 192.168.39.158 and MAC address 52:54:00:43:c9:d0 in network mk-multinode-832048
	I0717 18:03:52.494622   47955 host.go:66] Checking if "multinode-832048" exists ...
	I0717 18:03:52.494929   47955 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:03:52.494964   47955 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:03:52.509928   47955 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34765
	I0717 18:03:52.510403   47955 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:03:52.510857   47955 main.go:141] libmachine: Using API Version  1
	I0717 18:03:52.510874   47955 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:03:52.511197   47955 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:03:52.511345   47955 main.go:141] libmachine: (multinode-832048) Calling .DriverName
	I0717 18:03:52.511532   47955 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 18:03:52.511552   47955 main.go:141] libmachine: (multinode-832048) Calling .GetSSHHostname
	I0717 18:03:52.514304   47955 main.go:141] libmachine: (multinode-832048) DBG | domain multinode-832048 has defined MAC address 52:54:00:43:c9:d0 in network mk-multinode-832048
	I0717 18:03:52.514700   47955 main.go:141] libmachine: (multinode-832048) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:43:c9:d0", ip: ""} in network mk-multinode-832048: {Iface:virbr1 ExpiryTime:2024-07-17 19:00:54 +0000 UTC Type:0 Mac:52:54:00:43:c9:d0 Iaid: IPaddr:192.168.39.158 Prefix:24 Hostname:multinode-832048 Clientid:01:52:54:00:43:c9:d0}
	I0717 18:03:52.514728   47955 main.go:141] libmachine: (multinode-832048) DBG | domain multinode-832048 has defined IP address 192.168.39.158 and MAC address 52:54:00:43:c9:d0 in network mk-multinode-832048
	I0717 18:03:52.515000   47955 main.go:141] libmachine: (multinode-832048) Calling .GetSSHPort
	I0717 18:03:52.515180   47955 main.go:141] libmachine: (multinode-832048) Calling .GetSSHKeyPath
	I0717 18:03:52.515327   47955 main.go:141] libmachine: (multinode-832048) Calling .GetSSHUsername
	I0717 18:03:52.515447   47955 sshutil.go:53] new ssh client: &{IP:192.168.39.158 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/multinode-832048/id_rsa Username:docker}
	I0717 18:03:52.594318   47955 ssh_runner.go:195] Run: systemctl --version
	I0717 18:03:52.600422   47955 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 18:03:52.616986   47955 kubeconfig.go:125] found "multinode-832048" server: "https://192.168.39.158:8443"
	I0717 18:03:52.617019   47955 api_server.go:166] Checking apiserver status ...
	I0717 18:03:52.617072   47955 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0717 18:03:52.631071   47955 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1128/cgroup
	W0717 18:03:52.641436   47955 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1128/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0717 18:03:52.641494   47955 ssh_runner.go:195] Run: ls
	I0717 18:03:52.645840   47955 api_server.go:253] Checking apiserver healthz at https://192.168.39.158:8443/healthz ...
	I0717 18:03:52.649926   47955 api_server.go:279] https://192.168.39.158:8443/healthz returned 200:
	ok
	I0717 18:03:52.649951   47955 status.go:422] multinode-832048 apiserver status = Running (err=<nil>)
	I0717 18:03:52.649963   47955 status.go:257] multinode-832048 status: &{Name:multinode-832048 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 18:03:52.650004   47955 status.go:255] checking status of multinode-832048-m02 ...
	I0717 18:03:52.650344   47955 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:03:52.650384   47955 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:03:52.665792   47955 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42795
	I0717 18:03:52.666182   47955 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:03:52.666588   47955 main.go:141] libmachine: Using API Version  1
	I0717 18:03:52.666607   47955 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:03:52.666918   47955 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:03:52.667088   47955 main.go:141] libmachine: (multinode-832048-m02) Calling .GetState
	I0717 18:03:52.668518   47955 status.go:330] multinode-832048-m02 host status = "Running" (err=<nil>)
	I0717 18:03:52.668535   47955 host.go:66] Checking if "multinode-832048-m02" exists ...
	I0717 18:03:52.668801   47955 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:03:52.668834   47955 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:03:52.683516   47955 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40385
	I0717 18:03:52.683904   47955 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:03:52.684350   47955 main.go:141] libmachine: Using API Version  1
	I0717 18:03:52.684376   47955 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:03:52.684684   47955 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:03:52.684844   47955 main.go:141] libmachine: (multinode-832048-m02) Calling .GetIP
	I0717 18:03:52.687465   47955 main.go:141] libmachine: (multinode-832048-m02) DBG | domain multinode-832048-m02 has defined MAC address 52:54:00:dd:bd:dd in network mk-multinode-832048
	I0717 18:03:52.687896   47955 main.go:141] libmachine: (multinode-832048-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:dd:bd:dd", ip: ""} in network mk-multinode-832048: {Iface:virbr1 ExpiryTime:2024-07-17 19:02:06 +0000 UTC Type:0 Mac:52:54:00:dd:bd:dd Iaid: IPaddr:192.168.39.105 Prefix:24 Hostname:multinode-832048-m02 Clientid:01:52:54:00:dd:bd:dd}
	I0717 18:03:52.687919   47955 main.go:141] libmachine: (multinode-832048-m02) DBG | domain multinode-832048-m02 has defined IP address 192.168.39.105 and MAC address 52:54:00:dd:bd:dd in network mk-multinode-832048
	I0717 18:03:52.688091   47955 host.go:66] Checking if "multinode-832048-m02" exists ...
	I0717 18:03:52.688373   47955 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:03:52.688418   47955 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:03:52.704306   47955 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35729
	I0717 18:03:52.704802   47955 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:03:52.705321   47955 main.go:141] libmachine: Using API Version  1
	I0717 18:03:52.705348   47955 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:03:52.705655   47955 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:03:52.705838   47955 main.go:141] libmachine: (multinode-832048-m02) Calling .DriverName
	I0717 18:03:52.706037   47955 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0717 18:03:52.706055   47955 main.go:141] libmachine: (multinode-832048-m02) Calling .GetSSHHostname
	I0717 18:03:52.709180   47955 main.go:141] libmachine: (multinode-832048-m02) DBG | domain multinode-832048-m02 has defined MAC address 52:54:00:dd:bd:dd in network mk-multinode-832048
	I0717 18:03:52.709586   47955 main.go:141] libmachine: (multinode-832048-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:dd:bd:dd", ip: ""} in network mk-multinode-832048: {Iface:virbr1 ExpiryTime:2024-07-17 19:02:06 +0000 UTC Type:0 Mac:52:54:00:dd:bd:dd Iaid: IPaddr:192.168.39.105 Prefix:24 Hostname:multinode-832048-m02 Clientid:01:52:54:00:dd:bd:dd}
	I0717 18:03:52.709607   47955 main.go:141] libmachine: (multinode-832048-m02) DBG | domain multinode-832048-m02 has defined IP address 192.168.39.105 and MAC address 52:54:00:dd:bd:dd in network mk-multinode-832048
	I0717 18:03:52.709805   47955 main.go:141] libmachine: (multinode-832048-m02) Calling .GetSSHPort
	I0717 18:03:52.709952   47955 main.go:141] libmachine: (multinode-832048-m02) Calling .GetSSHKeyPath
	I0717 18:03:52.710060   47955 main.go:141] libmachine: (multinode-832048-m02) Calling .GetSSHUsername
	I0717 18:03:52.710163   47955 sshutil.go:53] new ssh client: &{IP:192.168.39.105 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/19283-14409/.minikube/machines/multinode-832048-m02/id_rsa Username:docker}
	I0717 18:03:52.786202   47955 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0717 18:03:52.802699   47955 status.go:257] multinode-832048-m02 status: &{Name:multinode-832048-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0717 18:03:52.802757   47955 status.go:255] checking status of multinode-832048-m03 ...
	I0717 18:03:52.803102   47955 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:03:52.803157   47955 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:03:52.818309   47955 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43103
	I0717 18:03:52.818741   47955 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:03:52.819164   47955 main.go:141] libmachine: Using API Version  1
	I0717 18:03:52.819186   47955 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:03:52.819476   47955 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:03:52.819623   47955 main.go:141] libmachine: (multinode-832048-m03) Calling .GetState
	I0717 18:03:52.821108   47955 status.go:330] multinode-832048-m03 host status = "Stopped" (err=<nil>)
	I0717 18:03:52.821121   47955 status.go:343] host is not running, skipping remaining checks
	I0717 18:03:52.821127   47955 status.go:257] multinode-832048-m03 status: &{Name:multinode-832048-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.13s)

TestMultiNode/serial/StartAfterStop (34.66s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 node start m03 -v=7 --alsologtostderr
multinode_test.go:282: (dbg) Done: out/minikube-linux-amd64 -p multinode-832048 node start m03 -v=7 --alsologtostderr: (34.06491143s)
multinode_test.go:290: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status -v=7 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (34.66s)

TestMultiNode/serial/RestartKeepsNodes (332.68s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-linux-amd64 node list -p multinode-832048
multinode_test.go:321: (dbg) Run:  out/minikube-linux-amd64 stop -p multinode-832048
E0717 18:04:41.797559   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
multinode_test.go:321: (dbg) Done: out/minikube-linux-amd64 stop -p multinode-832048: (3m4.201357692s)
multinode_test.go:326: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-832048 --wait=true -v=8 --alsologtostderr
E0717 18:07:52.134572   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 18:09:41.797261   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
multinode_test.go:326: (dbg) Done: out/minikube-linux-amd64 start -p multinode-832048 --wait=true -v=8 --alsologtostderr: (2m28.39824923s)
multinode_test.go:331: (dbg) Run:  out/minikube-linux-amd64 node list -p multinode-832048
--- PASS: TestMultiNode/serial/RestartKeepsNodes (332.68s)

TestMultiNode/serial/DeleteNode (2.1s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-linux-amd64 -p multinode-832048 node delete m03: (1.591173128s)
multinode_test.go:422: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (2.10s)

TestMultiNode/serial/StopMultiNode (183.08s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 stop
E0717 18:12:35.191854   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
E0717 18:12:52.134741   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
multinode_test.go:345: (dbg) Done: out/minikube-linux-amd64 -p multinode-832048 stop: (3m2.91410826s)
multinode_test.go:351: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-linux-amd64 -p multinode-832048 status: exit status 7 (79.074198ms)

-- stdout --
	multinode-832048
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-832048-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-linux-amd64 -p multinode-832048 status --alsologtostderr: exit status 7 (82.359762ms)

-- stdout --
	multinode-832048
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-832048-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0717 18:13:05.303219   50723 out.go:291] Setting OutFile to fd 1 ...
	I0717 18:13:05.303475   50723 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 18:13:05.303485   50723 out.go:304] Setting ErrFile to fd 2...
	I0717 18:13:05.303489   50723 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 18:13:05.303659   50723 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 18:13:05.303834   50723 out.go:298] Setting JSON to false
	I0717 18:13:05.303862   50723 mustload.go:65] Loading cluster: multinode-832048
	I0717 18:13:05.303976   50723 notify.go:220] Checking for updates...
	I0717 18:13:05.304340   50723 config.go:182] Loaded profile config "multinode-832048": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 18:13:05.304359   50723 status.go:255] checking status of multinode-832048 ...
	I0717 18:13:05.304867   50723 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:13:05.304944   50723 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:13:05.323993   50723 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41101
	I0717 18:13:05.324418   50723 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:13:05.324925   50723 main.go:141] libmachine: Using API Version  1
	I0717 18:13:05.324957   50723 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:13:05.325278   50723 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:13:05.325448   50723 main.go:141] libmachine: (multinode-832048) Calling .GetState
	I0717 18:13:05.327054   50723 status.go:330] multinode-832048 host status = "Stopped" (err=<nil>)
	I0717 18:13:05.327067   50723 status.go:343] host is not running, skipping remaining checks
	I0717 18:13:05.327074   50723 status.go:257] multinode-832048 status: &{Name:multinode-832048 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0717 18:13:05.327114   50723 status.go:255] checking status of multinode-832048-m02 ...
	I0717 18:13:05.327410   50723 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0717 18:13:05.327451   50723 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0717 18:13:05.342227   50723 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42995
	I0717 18:13:05.342655   50723 main.go:141] libmachine: () Calling .GetVersion
	I0717 18:13:05.343199   50723 main.go:141] libmachine: Using API Version  1
	I0717 18:13:05.343220   50723 main.go:141] libmachine: () Calling .SetConfigRaw
	I0717 18:13:05.343539   50723 main.go:141] libmachine: () Calling .GetMachineName
	I0717 18:13:05.343736   50723 main.go:141] libmachine: (multinode-832048-m02) Calling .GetState
	I0717 18:13:05.345046   50723 status.go:330] multinode-832048-m02 host status = "Stopped" (err=<nil>)
	I0717 18:13:05.345059   50723 status.go:343] host is not running, skipping remaining checks
	I0717 18:13:05.345076   50723 status.go:257] multinode-832048-m02 status: &{Name:multinode-832048-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (183.08s)

TestMultiNode/serial/RestartMultiNode (107.29s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-832048 --wait=true -v=8 --alsologtostderr --driver=kvm2  --container-runtime=containerd
E0717 18:14:41.796958   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
multinode_test.go:376: (dbg) Done: out/minikube-linux-amd64 start -p multinode-832048 --wait=true -v=8 --alsologtostderr --driver=kvm2  --container-runtime=containerd: (1m46.771309466s)
multinode_test.go:382: (dbg) Run:  out/minikube-linux-amd64 -p multinode-832048 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (107.29s)

TestMultiNode/serial/ValidateNameConflict (44.01s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-linux-amd64 node list -p multinode-832048
multinode_test.go:464: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-832048-m02 --driver=kvm2  --container-runtime=containerd
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p multinode-832048-m02 --driver=kvm2  --container-runtime=containerd: exit status 14 (60.969153ms)

-- stdout --
	* [multinode-832048-m02] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=19283
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-832048-m02' is duplicated with machine name 'multinode-832048-m02' in profile 'multinode-832048'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-832048-m03 --driver=kvm2  --container-runtime=containerd
multinode_test.go:472: (dbg) Done: out/minikube-linux-amd64 start -p multinode-832048-m03 --driver=kvm2  --container-runtime=containerd: (42.908031707s)
multinode_test.go:479: (dbg) Run:  out/minikube-linux-amd64 node add -p multinode-832048
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-linux-amd64 node add -p multinode-832048: exit status 80 (209.333232ms)

-- stdout --
	* Adding node m03 to cluster multinode-832048 as [worker]
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-832048-m03 already exists in multinode-832048-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-linux-amd64 delete -p multinode-832048-m03
--- PASS: TestMultiNode/serial/ValidateNameConflict (44.01s)

TestPreload (295.27s)

=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-linux-amd64 start -p test-preload-129552 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.24.4
preload_test.go:44: (dbg) Done: out/minikube-linux-amd64 start -p test-preload-129552 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.24.4: (2m8.275251959s)
preload_test.go:52: (dbg) Run:  out/minikube-linux-amd64 -p test-preload-129552 image pull gcr.io/k8s-minikube/busybox
preload_test.go:52: (dbg) Done: out/minikube-linux-amd64 -p test-preload-129552 image pull gcr.io/k8s-minikube/busybox: (2.906148166s)
preload_test.go:58: (dbg) Run:  out/minikube-linux-amd64 stop -p test-preload-129552
E0717 18:17:52.134243   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
preload_test.go:58: (dbg) Done: out/minikube-linux-amd64 stop -p test-preload-129552: (1m31.635501349s)
preload_test.go:66: (dbg) Run:  out/minikube-linux-amd64 start -p test-preload-129552 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=kvm2  --container-runtime=containerd
E0717 18:19:24.849297   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 18:19:41.797398   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
preload_test.go:66: (dbg) Done: out/minikube-linux-amd64 start -p test-preload-129552 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=kvm2  --container-runtime=containerd: (1m11.405540712s)
preload_test.go:71: (dbg) Run:  out/minikube-linux-amd64 -p test-preload-129552 image list
helpers_test.go:175: Cleaning up "test-preload-129552" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p test-preload-129552
--- PASS: TestPreload (295.27s)

TestScheduledStopUnix (118.46s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-linux-amd64 start -p scheduled-stop-699323 --memory=2048 --driver=kvm2  --container-runtime=containerd
scheduled_stop_test.go:128: (dbg) Done: out/minikube-linux-amd64 start -p scheduled-stop-699323 --memory=2048 --driver=kvm2  --container-runtime=containerd: (46.890762552s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-amd64 stop -p scheduled-stop-699323 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-linux-amd64 status --format={{.TimeToStop}} -p scheduled-stop-699323 -n scheduled-stop-699323
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-amd64 stop -p scheduled-stop-699323 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-amd64 stop -p scheduled-stop-699323 --cancel-scheduled
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p scheduled-stop-699323 -n scheduled-stop-699323
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-linux-amd64 status -p scheduled-stop-699323
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-amd64 stop -p scheduled-stop-699323 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-linux-amd64 status -p scheduled-stop-699323
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-linux-amd64 status -p scheduled-stop-699323: exit status 7 (64.189454ms)

-- stdout --
	scheduled-stop-699323
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p scheduled-stop-699323 -n scheduled-stop-699323
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p scheduled-stop-699323 -n scheduled-stop-699323: exit status 7 (63.426493ms)

-- stdout --
	Stopped

-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-699323" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p scheduled-stop-699323
--- PASS: TestScheduledStopUnix (118.46s)

TestRunningBinaryUpgrade (177.14s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /tmp/minikube-v1.26.0.1999805056 start -p running-upgrade-041659 --memory=2200 --vm-driver=kvm2  --container-runtime=containerd
E0717 18:22:52.134210   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
version_upgrade_test.go:120: (dbg) Done: /tmp/minikube-v1.26.0.1999805056 start -p running-upgrade-041659 --memory=2200 --vm-driver=kvm2  --container-runtime=containerd: (2m2.400519569s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-linux-amd64 start -p running-upgrade-041659 --memory=2200 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
E0717 18:24:41.797429   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
version_upgrade_test.go:130: (dbg) Done: out/minikube-linux-amd64 start -p running-upgrade-041659 --memory=2200 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (50.164773855s)
helpers_test.go:175: Cleaning up "running-upgrade-041659" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p running-upgrade-041659
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p running-upgrade-041659: (1.168251112s)
--- PASS: TestRunningBinaryUpgrade (177.14s)

TestKubernetesUpgrade (138.95s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-linux-amd64 start -p kubernetes-upgrade-109027 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
E0717 18:27:52.134261   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
version_upgrade_test.go:222: (dbg) Done: out/minikube-linux-amd64 start -p kubernetes-upgrade-109027 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (1m1.829557167s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-linux-amd64 stop -p kubernetes-upgrade-109027
version_upgrade_test.go:227: (dbg) Done: out/minikube-linux-amd64 stop -p kubernetes-upgrade-109027: (1.550925192s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-linux-amd64 -p kubernetes-upgrade-109027 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-linux-amd64 -p kubernetes-upgrade-109027 status --format={{.Host}}: exit status 7 (63.056457ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-linux-amd64 start -p kubernetes-upgrade-109027 --memory=2200 --kubernetes-version=v1.31.0-beta.0 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:243: (dbg) Done: out/minikube-linux-amd64 start -p kubernetes-upgrade-109027 --memory=2200 --kubernetes-version=v1.31.0-beta.0 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (36.021541457s)
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-109027 version --output=json
version_upgrade_test.go:267: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:269: (dbg) Run:  out/minikube-linux-amd64 start -p kubernetes-upgrade-109027 --memory=2200 --kubernetes-version=v1.20.0 --driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:269: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p kubernetes-upgrade-109027 --memory=2200 --kubernetes-version=v1.20.0 --driver=kvm2  --container-runtime=containerd: exit status 106 (88.499341ms)

-- stdout --
	* [kubernetes-upgrade-109027] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=19283
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.31.0-beta.0 cluster to v1.20.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.20.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-109027
	    minikube start -p kubernetes-upgrade-109027 --kubernetes-version=v1.20.0
	    
	    2) Create a second cluster with Kubernetes 1.20.0, by running:
	    
	    minikube start -p kubernetes-upgrade-1090272 --kubernetes-version=v1.20.0
	    
	    3) Use the existing cluster at version Kubernetes 1.31.0-beta.0, by running:
	    
	    minikube start -p kubernetes-upgrade-109027 --kubernetes-version=v1.31.0-beta.0
	    

** /stderr **
version_upgrade_test.go:273: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:275: (dbg) Run:  out/minikube-linux-amd64 start -p kubernetes-upgrade-109027 --memory=2200 --kubernetes-version=v1.31.0-beta.0 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
E0717 18:29:15.192697   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
version_upgrade_test.go:275: (dbg) Done: out/minikube-linux-amd64 start -p kubernetes-upgrade-109027 --memory=2200 --kubernetes-version=v1.31.0-beta.0 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (38.407858558s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-109027" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p kubernetes-upgrade-109027
--- PASS: TestKubernetesUpgrade (138.95s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.08s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-971212 --no-kubernetes --kubernetes-version=1.20 --driver=kvm2  --container-runtime=containerd
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p NoKubernetes-971212 --no-kubernetes --kubernetes-version=1.20 --driver=kvm2  --container-runtime=containerd: exit status 14 (78.280884ms)

-- stdout --
	* [NoKubernetes-971212] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=19283
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.08s)

TestStartStop/group/old-k8s-version/serial/FirstStart (159.71s)

=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p old-k8s-version-373866 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.20.0
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p old-k8s-version-373866 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.20.0: (2m39.709844719s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (159.71s)

TestNoKubernetes/serial/StartWithK8s (97.29s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-971212 --driver=kvm2  --container-runtime=containerd
no_kubernetes_test.go:95: (dbg) Done: out/minikube-linux-amd64 start -p NoKubernetes-971212 --driver=kvm2  --container-runtime=containerd: (1m37.024074774s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-linux-amd64 -p NoKubernetes-971212 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (97.29s)

TestNetworkPlugins/group/false (2.92s)

=== RUN   TestNetworkPlugins/group/false
net_test.go:246: (dbg) Run:  out/minikube-linux-amd64 start -p false-925288 --memory=2048 --alsologtostderr --cni=false --driver=kvm2  --container-runtime=containerd
net_test.go:246: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p false-925288 --memory=2048 --alsologtostderr --cni=false --driver=kvm2  --container-runtime=containerd: exit status 14 (103.569823ms)

-- stdout --
	* [false-925288] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=19283
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the kvm2 driver based on user configuration
	
	

-- /stdout --
** stderr ** 
	I0717 18:24:01.983509   56754 out.go:291] Setting OutFile to fd 1 ...
	I0717 18:24:01.983649   56754 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 18:24:01.983660   56754 out.go:304] Setting ErrFile to fd 2...
	I0717 18:24:01.983667   56754 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0717 18:24:01.983928   56754 root.go:338] Updating PATH: /home/jenkins/minikube-integration/19283-14409/.minikube/bin
	I0717 18:24:01.984692   56754 out.go:298] Setting JSON to false
	I0717 18:24:01.985981   56754 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-5","uptime":7585,"bootTime":1721233057,"procs":208,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1062-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0717 18:24:01.986065   56754 start.go:139] virtualization: kvm guest
	I0717 18:24:01.988616   56754 out.go:177] * [false-925288] minikube v1.33.1 on Ubuntu 20.04 (kvm/amd64)
	I0717 18:24:01.990112   56754 out.go:177]   - MINIKUBE_LOCATION=19283
	I0717 18:24:01.990111   56754 notify.go:220] Checking for updates...
	I0717 18:24:01.991534   56754 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0717 18:24:01.992846   56754 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/19283-14409/kubeconfig
	I0717 18:24:01.994246   56754 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/19283-14409/.minikube
	I0717 18:24:01.995662   56754 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0717 18:24:01.997220   56754 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0717 18:24:01.999128   56754 config.go:182] Loaded profile config "NoKubernetes-971212": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.30.2
	I0717 18:24:01.999259   56754 config.go:182] Loaded profile config "old-k8s-version-373866": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.20.0
	I0717 18:24:01.999367   56754 config.go:182] Loaded profile config "running-upgrade-041659": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.24.1
	I0717 18:24:01.999467   56754 driver.go:392] Setting default libvirt URI to qemu:///system
	I0717 18:24:02.035730   56754 out.go:177] * Using the kvm2 driver based on user configuration
	I0717 18:24:02.037049   56754 start.go:297] selected driver: kvm2
	I0717 18:24:02.037064   56754 start.go:901] validating driver "kvm2" against <nil>
	I0717 18:24:02.037076   56754 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0717 18:24:02.039294   56754 out.go:177] 
	W0717 18:24:02.041039   56754 out.go:239] X Exiting due to MK_USAGE: The "containerd" container runtime requires CNI
	X Exiting due to MK_USAGE: The "containerd" container runtime requires CNI
	I0717 18:24:02.042348   56754 out.go:177] 

** /stderr **
net_test.go:88: 
----------------------- debugLogs start: false-925288 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> host: /etc/nsswitch.conf:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /etc/hosts:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /etc/resolv.conf:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> host: crictl pods:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: crictl containers:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> k8s: describe netcat deployment:
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe netcat pod(s):
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: netcat logs:
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns deployment:
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns pods:
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: coredns logs:
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe api server pod(s):
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: api server logs:
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> host: /etc/cni:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: ip a s:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: ip r s:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: iptables-save:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: iptables table nat:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy daemon set:
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy pod(s):
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: kube-proxy logs:
error: context "false-925288" does not exist

                                                
                                                

                                                
                                                
>>> host: kubelet daemon status:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: kubelet daemon config:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> k8s: kubelet logs:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /var/lib/kubelet/config.yaml:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
certificate-authority: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt
extensions:
- extension:
last-update: Wed, 17 Jul 2024 18:23:44 UTC
provider: minikube.sigs.k8s.io
version: v1.33.1
name: cluster_info
server: https://192.168.39.77:8443
name: old-k8s-version-373866
contexts:
- context:
cluster: old-k8s-version-373866
extensions:
- extension:
last-update: Wed, 17 Jul 2024 18:23:44 UTC
provider: minikube.sigs.k8s.io
version: v1.33.1
name: context_info
namespace: default
user: old-k8s-version-373866
name: old-k8s-version-373866
current-context: ""
kind: Config
preferences: {}
users:
- name: old-k8s-version-373866
user:
client-certificate: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt
client-key: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.key

                                                
                                                

                                                
                                                
>>> k8s: cms:
Error in configuration: context was not found for specified context: false-925288

                                                
                                                

                                                
                                                
>>> host: docker daemon status:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: docker daemon config:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /etc/docker/daemon.json:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: docker system info:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon status:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon config:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: cri-dockerd version:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: containerd daemon status:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: containerd daemon config:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /lib/systemd/system/containerd.service:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: /etc/containerd/config.toml:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: containerd config dump:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

                                                
                                                

                                                
                                                
>>> host: crio daemon status:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

>>> host: crio daemon config:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

>>> host: /etc/crio:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

>>> host: crio config:
* Profile "false-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-925288"

----------------------- debugLogs end: false-925288 [took: 2.677354758s] --------------------------------
helpers_test.go:175: Cleaning up "false-925288" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p false-925288
--- PASS: TestNetworkPlugins/group/false (2.92s)

                                                
                                    
TestNoKubernetes/serial/StartWithStopK8s (38.81s)
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-971212 --no-kubernetes --driver=kvm2  --container-runtime=containerd
no_kubernetes_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p NoKubernetes-971212 --no-kubernetes --driver=kvm2  --container-runtime=containerd: (37.785538396s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-linux-amd64 -p NoKubernetes-971212 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-linux-amd64 -p NoKubernetes-971212 status -o json: exit status 2 (219.577294ms)
-- stdout --
	{"Name":"NoKubernetes-971212","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}
-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-linux-amd64 delete -p NoKubernetes-971212
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (38.81s)
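For context on what this step asserts: a `--no-kubernetes` profile is expected to report a running host with the Kubernetes components stopped, which is exactly what the `status -o json` payload above shows (and why exit status 2 here is acceptable to the test). A minimal sketch of that check in Python — the helper name is hypothetical; the JSON is copied verbatim from the output above:

```python
import json

# Status JSON copied from the `minikube status -o json` output above.
STATUS = ('{"Name":"NoKubernetes-971212","Host":"Running","Kubelet":"Stopped",'
          '"APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}')

def k8s_stopped(raw: str) -> bool:
    """True when the VM is up but kubelet and the apiserver are not running."""
    s = json.loads(raw)
    return (s["Host"] == "Running"
            and s["Kubelet"] == "Stopped"
            and s["APIServer"] == "Stopped")

print(k8s_stopped(STATUS))  # True for the status captured above
```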

                                                
                                    
TestNoKubernetes/serial/Start (38.03s)
=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-971212 --no-kubernetes --driver=kvm2  --container-runtime=containerd
no_kubernetes_test.go:136: (dbg) Done: out/minikube-linux-amd64 start -p NoKubernetes-971212 --no-kubernetes --driver=kvm2  --container-runtime=containerd: (38.026502755s)
--- PASS: TestNoKubernetes/serial/Start (38.03s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/DeployApp (10.59s)
=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-373866 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [12c6a52f-6caa-4012-83da-4f5b5cc5c266] Pending
helpers_test.go:344: "busybox" [12c6a52f-6caa-4012-83da-4f5b5cc5c266] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [12c6a52f-6caa-4012-83da-4f5b5cc5c266] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 10.004366939s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-373866 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (10.59s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (1.2s)
=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p old-k8s-version-373866 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:205: (dbg) Done: out/minikube-linux-amd64 addons enable metrics-server -p old-k8s-version-373866 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: (1.0902111s)
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context old-k8s-version-373866 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (1.20s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/Stop (91.89s)
=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p old-k8s-version-373866 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p old-k8s-version-373866 --alsologtostderr -v=3: (1m31.891450908s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (91.89s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunning (0.2s)
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-linux-amd64 ssh -p NoKubernetes-971212 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-linux-amd64 ssh -p NoKubernetes-971212 "sudo systemctl is-active --quiet service kubelet": exit status 1 (198.062583ms)
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.20s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.18s)
=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p old-k8s-version-373866 -n old-k8s-version-373866
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p old-k8s-version-373866 -n old-k8s-version-373866: exit status 7 (65.513887ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p old-k8s-version-373866 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.18s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/SecondStart (563.64s)
=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p old-k8s-version-373866 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.20.0
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p old-k8s-version-373866 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.20.0: (9m23.379317592s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p old-k8s-version-373866 -n old-k8s-version-373866
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (563.64s)

                                                
                                    
TestPause/serial/Start (73.26s)
=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-linux-amd64 start -p pause-244300 --memory=2048 --install-addons=false --wait=all --driver=kvm2  --container-runtime=containerd
pause_test.go:80: (dbg) Done: out/minikube-linux-amd64 start -p pause-244300 --memory=2048 --install-addons=false --wait=all --driver=kvm2  --container-runtime=containerd: (1m13.260182976s)
--- PASS: TestPause/serial/Start (73.26s)

                                                
                                    
TestPause/serial/SecondStartNoReconfiguration (41.17s)
=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-linux-amd64 start -p pause-244300 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
pause_test.go:92: (dbg) Done: out/minikube-linux-amd64 start -p pause-244300 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (41.142877627s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (41.17s)

                                                
                                    
TestPause/serial/Pause (0.68s)
=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-linux-amd64 pause -p pause-244300 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.68s)

                                                
                                    
TestPause/serial/VerifyStatus (0.26s)
=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-linux-amd64 status -p pause-244300 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-amd64 status -p pause-244300 --output=json --layout=cluster: exit status 2 (264.390402ms)
-- stdout --
	{"Name":"pause-244300","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 6 containers in: kube-system, kubernetes-dashboard, storage-gluster, istio-operator","BinaryVersion":"v1.33.1","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-244300","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}
-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.26s)
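The `--layout=cluster` JSON above encodes state with HTTP-like status codes (200 "OK", 405 "Stopped", 418 "Paused"), which is why a paused cluster yields exit status 2 while the JSON itself is well-formed. A small sketch of pulling the per-component states out of that payload — the function name is hypothetical, and the JSON is abridged from the output above to the fields used here:

```python
import json

# Abridged from the `minikube status --output=json --layout=cluster` payload above:
# top-level StatusCode 418 ("Paused"), with per-node component codes.
LAYOUT = json.loads(
    '{"Name":"pause-244300","StatusCode":418,"StatusName":"Paused",'
    '"Nodes":[{"Name":"pause-244300","StatusCode":200,"StatusName":"OK",'
    '"Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},'
    '"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}'
)

def component_states(layout: dict) -> dict:
    """Map each component across all nodes to its human-readable status name."""
    return {name: comp["StatusName"]
            for node in layout["Nodes"]
            for name, comp in node["Components"].items()}

print(component_states(LAYOUT))  # {'apiserver': 'Paused', 'kubelet': 'Stopped'}
```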

                                                
                                    
TestPause/serial/Unpause (0.79s)
=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-linux-amd64 unpause -p pause-244300 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.79s)

                                                
                                    
TestPause/serial/PauseAgain (1.03s)
=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-linux-amd64 pause -p pause-244300 --alsologtostderr -v=5
pause_test.go:110: (dbg) Done: out/minikube-linux-amd64 pause -p pause-244300 --alsologtostderr -v=5: (1.030562436s)
--- PASS: TestPause/serial/PauseAgain (1.03s)

                                                
                                    
TestPause/serial/DeletePaused (1.09s)
=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-linux-amd64 delete -p pause-244300 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-amd64 delete -p pause-244300 --alsologtostderr -v=5: (1.094698526s)
--- PASS: TestPause/serial/DeletePaused (1.09s)

                                                
                                    
TestPause/serial/VerifyDeletedResources (0.55s)
=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestPause/serial/VerifyDeletedResources (0.55s)

                                                
                                    
TestStartStop/group/no-preload/serial/FirstStart (100.44s)
=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p no-preload-770827 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.31.0-beta.0
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p no-preload-770827 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.31.0-beta.0: (1m40.440738324s)
--- PASS: TestStartStop/group/no-preload/serial/FirstStart (100.44s)

                                                
                                    
TestStartStop/group/embed-certs/serial/FirstStart (97.72s)
=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p embed-certs-399640 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.30.2
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p embed-certs-399640 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.30.2: (1m37.719091279s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (97.72s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/FirstStart (85.23s)
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p default-k8s-diff-port-056265 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.30.2
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p default-k8s-diff-port-056265 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.30.2: (1m25.234716979s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/FirstStart (85.23s)

                                                
                                    
TestStartStop/group/no-preload/serial/DeployApp (9.31s)
=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-770827 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [7de24dfd-49e0-4260-aa6d-a213cacd112f] Pending
helpers_test.go:344: "busybox" [7de24dfd-49e0-4260-aa6d-a213cacd112f] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [7de24dfd-49e0-4260-aa6d-a213cacd112f] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: integration-test=busybox healthy within 9.009488302s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-770827 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/no-preload/serial/DeployApp (9.31s)

                                                
                                    
TestStartStop/group/no-preload/serial/EnableAddonWhileActive (1.07s)
=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p no-preload-770827 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context no-preload-770827 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (1.07s)

                                                
                                    
TestStartStop/group/no-preload/serial/Stop (91.64s)
=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p no-preload-770827 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p no-preload-770827 --alsologtostderr -v=3: (1m31.643244602s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (91.64s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/DeployApp (10.26s)
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-056265 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [fec4f43b-15d9-45ff-b2b7-6463fdfeed50] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [fec4f43b-15d9-45ff-b2b7-6463fdfeed50] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: integration-test=busybox healthy within 10.003719384s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-056265 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/DeployApp (10.26s)

                                                
                                    
TestStartStop/group/embed-certs/serial/DeployApp (10.27s)
=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-399640 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [d1e2797d-dc81-4463-900c-a6da589b96d9] Pending
helpers_test.go:344: "busybox" [d1e2797d-dc81-4463-900c-a6da589b96d9] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [d1e2797d-dc81-4463-900c-a6da589b96d9] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 10.003425125s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-399640 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (10.27s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (1.01s)
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p default-k8s-diff-port-056265 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context default-k8s-diff-port-056265 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (1.01s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/Stop (91.63s)
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p default-k8s-diff-port-056265 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p default-k8s-diff-port-056265 --alsologtostderr -v=3: (1m31.630528408s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Stop (91.63s)

                                                
                                    
TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.95s)
=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p embed-certs-399640 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context embed-certs-399640 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.95s)

                                                
                                    
TestStartStop/group/embed-certs/serial/Stop (91.62s)
=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p embed-certs-399640 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p embed-certs-399640 --alsologtostderr -v=3: (1m31.616534861s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (91.62s)

                                                
                                    
TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.17s)
=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p no-preload-770827 -n no-preload-770827
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p no-preload-770827 -n no-preload-770827: exit status 7 (64.457838ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p no-preload-770827 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.17s)

                                                
                                    
TestStartStop/group/no-preload/serial/SecondStart (309.32s)
=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p no-preload-770827 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.31.0-beta.0
E0717 18:32:52.134650   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/addons-566926/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p no-preload-770827 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.31.0-beta.0: (5m9.056041968s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p no-preload-770827 -n no-preload-770827
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (309.32s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.17s)
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265: exit status 7 (60.585089ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p default-k8s-diff-port-056265 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.17s)

                                                
                                    
TestStartStop/group/default-k8s-diff-port/serial/SecondStart (317.04s)
=== RUN   TestStartStop/group/default-k8s-diff-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p default-k8s-diff-port-056265 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.30.2
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p default-k8s-diff-port-056265 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.30.2: (5m16.80412042s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/SecondStart (317.04s)

                                                
                                    
TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.18s)
=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p embed-certs-399640 -n embed-certs-399640
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p embed-certs-399640 -n embed-certs-399640: exit status 7 (67.261076ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p embed-certs-399640 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.18s)

                                                
                                    
TestStartStop/group/embed-certs/serial/SecondStart (311.2s)
=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p embed-certs-399640 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.30.2
E0717 18:34:41.797588   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
E0717 18:36:04.850244   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p embed-certs-399640 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.30.2: (5m10.941084143s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p embed-certs-399640 -n embed-certs-399640
--- PASS: TestStartStop/group/embed-certs/serial/SecondStart (311.20s)

TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (6.01s)

=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-cd95d586-vf88l" [7a0c41f5-0f34-494a-baa3-d1b541840afa] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.00488343s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (6.01s)

TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (6.08s)

=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-cd95d586-vf88l" [7a0c41f5-0f34-494a-baa3-d1b541840afa] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.004063687s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context old-k8s-version-373866 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (6.08s)

TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.22s)

=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p old-k8s-version-373866 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: kindest/kindnetd:v20210326-1e038dc5
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.22s)

TestStartStop/group/old-k8s-version/serial/Pause (2.54s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p old-k8s-version-373866 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p old-k8s-version-373866 -n old-k8s-version-373866
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p old-k8s-version-373866 -n old-k8s-version-373866: exit status 2 (237.100383ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p old-k8s-version-373866 -n old-k8s-version-373866
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p old-k8s-version-373866 -n old-k8s-version-373866: exit status 2 (232.993521ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p old-k8s-version-373866 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p old-k8s-version-373866 -n old-k8s-version-373866
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p old-k8s-version-373866 -n old-k8s-version-373866
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (2.54s)

TestStartStop/group/newest-cni/serial/FirstStart (51.6s)

=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p newest-cni-985555 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.31.0-beta.0
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p newest-cni-985555 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.31.0-beta.0: (51.601000132s)
--- PASS: TestStartStop/group/newest-cni/serial/FirstStart (51.60s)

TestStartStop/group/newest-cni/serial/DeployApp (0s)

=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (1.13s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p newest-cni-985555 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:205: (dbg) Done: out/minikube-linux-amd64 addons enable metrics-server -p newest-cni-985555 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: (1.130177458s)
start_stop_delete_test.go:211: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (1.13s)

TestStartStop/group/newest-cni/serial/Stop (91.72s)

=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p newest-cni-985555 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p newest-cni-985555 --alsologtostderr -v=3: (1m31.7192467s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (91.72s)

TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (9.02s)

=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-5cc9f66cf4-d8dj8" [be31fd64-7da8-4519-a36a-df9ec2ec7d62] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
helpers_test.go:344: "kubernetes-dashboard-5cc9f66cf4-d8dj8" [be31fd64-7da8-4519-a36a-df9ec2ec7d62] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 9.016118218s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (9.02s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (6.08s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-5cc9f66cf4-d8dj8" [be31fd64-7da8-4519-a36a-df9ec2ec7d62] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.004397805s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context no-preload-770827 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (6.08s)

TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.23s)

=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p no-preload-770827 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.23s)

TestStartStop/group/no-preload/serial/Pause (2.77s)

=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p no-preload-770827 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p no-preload-770827 -n no-preload-770827
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p no-preload-770827 -n no-preload-770827: exit status 2 (255.002584ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p no-preload-770827 -n no-preload-770827
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p no-preload-770827 -n no-preload-770827: exit status 2 (257.828854ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p no-preload-770827 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p no-preload-770827 -n no-preload-770827
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p no-preload-770827 -n no-preload-770827
--- PASS: TestStartStop/group/no-preload/serial/Pause (2.77s)

TestStoppedBinaryUpgrade/Setup (3.14s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (3.14s)

TestStoppedBinaryUpgrade/Upgrade (143.39s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /tmp/minikube-v1.26.0.934343620 start -p stopped-upgrade-955760 --memory=2200 --vm-driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:183: (dbg) Done: /tmp/minikube-v1.26.0.934343620 start -p stopped-upgrade-955760 --memory=2200 --vm-driver=kvm2  --container-runtime=containerd: (53.068684798s)
version_upgrade_test.go:192: (dbg) Run:  /tmp/minikube-v1.26.0.934343620 -p stopped-upgrade-955760 stop
version_upgrade_test.go:192: (dbg) Done: /tmp/minikube-v1.26.0.934343620 -p stopped-upgrade-955760 stop: (1.225296345s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-linux-amd64 start -p stopped-upgrade-955760 --memory=2200 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:198: (dbg) Done: out/minikube-linux-amd64 start -p stopped-upgrade-955760 --memory=2200 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (1m29.100134117s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (143.39s)

TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (12.31s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-779776cb65-h5df7" [7957da61-37df-4af4-aecc-01380d12bb05] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
helpers_test.go:344: "kubernetes-dashboard-779776cb65-h5df7" [7957da61-37df-4af4-aecc-01380d12bb05] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 12.3136074s
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (12.31s)

TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (6.01s)

=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-779776cb65-xvr6c" [08e7665f-d292-4de9-8765-7dcd622f09b2] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.004579987s
--- PASS: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (6.01s)

TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.07s)

=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-779776cb65-xvr6c" [08e7665f-d292-4de9-8765-7dcd622f09b2] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.006298585s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context embed-certs-399640 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.07s)

TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (6.08s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-779776cb65-h5df7" [7957da61-37df-4af4-aecc-01380d12bb05] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.004815955s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context default-k8s-diff-port-056265 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (6.08s)

TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.33s)

=== RUN   TestStartStop/group/embed-certs/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p embed-certs-399640 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: kindest/kindnetd:v20240513-cd2ac642
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.33s)

TestStartStop/group/embed-certs/serial/Pause (2.94s)

=== RUN   TestStartStop/group/embed-certs/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p embed-certs-399640 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Done: out/minikube-linux-amd64 pause -p embed-certs-399640 --alsologtostderr -v=1: (1.087024303s)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p embed-certs-399640 -n embed-certs-399640
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p embed-certs-399640 -n embed-certs-399640: exit status 2 (240.995506ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p embed-certs-399640 -n embed-certs-399640
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p embed-certs-399640 -n embed-certs-399640: exit status 2 (230.552503ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p embed-certs-399640 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p embed-certs-399640 -n embed-certs-399640
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p embed-certs-399640 -n embed-certs-399640
--- PASS: TestStartStop/group/embed-certs/serial/Pause (2.94s)

TestNetworkPlugins/group/auto/Start (59.62s)

=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p auto-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p auto-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=kvm2  --container-runtime=containerd: (59.621466136s)
--- PASS: TestNetworkPlugins/group/auto/Start (59.62s)

TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages (0.22s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p default-k8s-diff-port-056265 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
start_stop_delete_test.go:304: Found non-minikube image: kindest/kindnetd:v20240513-cd2ac642
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages (0.22s)

TestStartStop/group/default-k8s-diff-port/serial/Pause (2.75s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p default-k8s-diff-port-056265 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265: exit status 2 (238.075538ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265: exit status 2 (245.593935ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p default-k8s-diff-port-056265 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-056265 -n default-k8s-diff-port-056265
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Pause (2.75s)

TestNetworkPlugins/group/kindnet/Start (93.88s)

=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p kindnet-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p kindnet-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=kvm2  --container-runtime=containerd: (1m33.877709935s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (93.88s)

TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.17s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p newest-cni-985555 -n newest-cni-985555
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p newest-cni-985555 -n newest-cni-985555: exit status 7 (62.688692ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p newest-cni-985555 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.17s)

TestStartStop/group/newest-cni/serial/SecondStart (77.69s)

=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p newest-cni-985555 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.31.0-beta.0
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p newest-cni-985555 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.31.0-beta.0: (1m17.251698924s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p newest-cni-985555 -n newest-cni-985555
--- PASS: TestStartStop/group/newest-cni/serial/SecondStart (77.69s)

TestNetworkPlugins/group/auto/KubeletFlags (0.24s)

=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p auto-925288 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.24s)

TestNetworkPlugins/group/auto/NetCatPod (10.25s)

=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context auto-925288 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-qff9f" [bced38dd-0cec-4cec-8031-86234af531ca] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-qff9f" [bced38dd-0cec-4cec-8031-86234af531ca] Running
E0717 18:39:41.797465   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/functional-142583/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 10.005173318s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (10.25s)

TestNetworkPlugins/group/auto/DNS (33.39s)

=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:175: (dbg) Run:  kubectl --context auto-925288 exec deployment/netcat -- nslookup kubernetes.default
net_test.go:175: (dbg) Non-zero exit: kubectl --context auto-925288 exec deployment/netcat -- nslookup kubernetes.default: exit status 1 (15.168990341s)

-- stdout --
	;; connection timed out; no servers could be reached
-- /stdout --
** stderr ** 
	command terminated with exit code 1
** /stderr **
net_test.go:175: (dbg) Run:  kubectl --context auto-925288 exec deployment/netcat -- nslookup kubernetes.default
E0717 18:40:12.018341   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:40:12.023747   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:40:12.034082   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:40:12.054405   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:40:12.094711   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:40:12.175088   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:40:12.335466   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:40:12.655887   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:40:13.296425   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
net_test.go:175: (dbg) Non-zero exit: kubectl --context auto-925288 exec deployment/netcat -- nslookup kubernetes.default: exit status 1 (15.158060288s)

-- stdout --
	;; connection timed out; no servers could be reached

-- /stdout --
** stderr ** 
	command terminated with exit code 1

** /stderr **
E0717 18:40:17.138364   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
net_test.go:175: (dbg) Run:  kubectl --context auto-925288 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (33.39s)

TestNetworkPlugins/group/kindnet/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:344: "kindnet-d44jm" [bf7725b2-6a5a-478a-b5a5-3351ddf0c57a] Running
E0717 18:40:14.577405   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 6.007910802s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (6.01s)

TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop
start_stop_delete_test.go:273: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/AddonExistsAfterStop
start_stop_delete_test.go:284: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.27s)

=== RUN   TestStartStop/group/newest-cni/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p newest-cni-985555 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: kindest/kindnetd:v20240715-f6ad1f6e
--- PASS: TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.27s)

TestStartStop/group/newest-cni/serial/Pause (3.59s)

=== RUN   TestStartStop/group/newest-cni/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p newest-cni-985555 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p newest-cni-985555 -n newest-cni-985555
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p newest-cni-985555 -n newest-cni-985555: exit status 2 (266.479595ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p newest-cni-985555 -n newest-cni-985555
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p newest-cni-985555 -n newest-cni-985555: exit status 2 (287.752753ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p newest-cni-985555 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p newest-cni-985555 -n newest-cni-985555
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p newest-cni-985555 -n newest-cni-985555
--- PASS: TestStartStop/group/newest-cni/serial/Pause (3.59s)

TestNetworkPlugins/group/auto/Localhost (0.17s)

=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:194: (dbg) Run:  kubectl --context auto-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.17s)

TestNetworkPlugins/group/auto/HairPin (0.16s)

=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:264: (dbg) Run:  kubectl --context auto-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/auto/HairPin (0.16s)

TestStoppedBinaryUpgrade/MinikubeLogs (1.14s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-linux-amd64 logs -p stopped-upgrade-955760
version_upgrade_test.go:206: (dbg) Done: out/minikube-linux-amd64 logs -p stopped-upgrade-955760: (1.137184363s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (1.14s)

TestNetworkPlugins/group/kindnet/KubeletFlags (0.23s)

=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p kindnet-925288 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.23s)

TestNetworkPlugins/group/kindnet/NetCatPod (10.33s)

=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kindnet-925288 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-zt47r" [b568436c-bb7b-48dc-86b5-9191a80c9732] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-zt47r" [b568436c-bb7b-48dc-86b5-9191a80c9732] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 10.004263963s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (10.33s)

TestNetworkPlugins/group/calico/Start (96.08s)

=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p calico-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=kvm2  --container-runtime=containerd
E0717 18:40:22.258806   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p calico-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=kvm2  --container-runtime=containerd: (1m36.082672462s)
--- PASS: TestNetworkPlugins/group/calico/Start (96.08s)

TestNetworkPlugins/group/custom-flannel/Start (117.49s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p custom-flannel-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p custom-flannel-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=kvm2  --container-runtime=containerd: (1m57.486345255s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (117.49s)

TestNetworkPlugins/group/kindnet/DNS (0.16s)

=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kindnet-925288 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.16s)

TestNetworkPlugins/group/kindnet/Localhost (0.15s)

=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kindnet-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.15s)

TestNetworkPlugins/group/kindnet/HairPin (0.16s)

=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kindnet-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.16s)

TestNetworkPlugins/group/enable-default-cni/Start (109.34s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p enable-default-cni-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=kvm2  --container-runtime=containerd
E0717 18:40:41.911775   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:41.917019   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:41.927280   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:41.947530   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:41.987803   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:42.068544   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:42.229009   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:42.550139   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:43.191162   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:44.471605   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p enable-default-cni-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=kvm2  --container-runtime=containerd: (1m49.335299879s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (109.34s)

TestNetworkPlugins/group/flannel/Start (141.51s)

=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p flannel-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=kvm2  --container-runtime=containerd
E0717 18:40:47.032341   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:52.153092   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:40:52.980097   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:41:02.393728   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:41:17.309337   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:17.314624   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:17.324869   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:17.345122   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:17.385443   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:17.465744   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:17.626442   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:17.947028   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:18.588131   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:19.868685   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:22.429424   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:22.874097   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
E0717 18:41:27.550057   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
E0717 18:41:33.940505   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt: no such file or directory
E0717 18:41:37.790976   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p flannel-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=kvm2  --container-runtime=containerd: (2m21.513244945s)
--- PASS: TestNetworkPlugins/group/flannel/Start (141.51s)

TestNetworkPlugins/group/calico/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:344: "calico-node-7pn2t" [ff320859-9782-41ac-80c4-a55f78bea45d] Running
E0717 18:41:58.271636   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/default-k8s-diff-port-056265/client.crt: no such file or directory
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 6.005707328s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (6.01s)

TestNetworkPlugins/group/calico/KubeletFlags (0.22s)

=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p calico-925288 "pgrep -a kubelet"
E0717 18:42:03.834848   21661 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/no-preload-770827/client.crt: no such file or directory
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.22s)

TestNetworkPlugins/group/calico/NetCatPod (9.23s)

=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context calico-925288 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-sszpl" [b75c3fd8-9092-4e65-a288-54fb639ea80f] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-sszpl" [b75c3fd8-9092-4e65-a288-54fb639ea80f] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 9.00527337s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (9.23s)

TestNetworkPlugins/group/calico/DNS (0.17s)

=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:175: (dbg) Run:  kubectl --context calico-925288 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.17s)

TestNetworkPlugins/group/calico/Localhost (0.13s)

=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:194: (dbg) Run:  kubectl --context calico-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.13s)

TestNetworkPlugins/group/calico/HairPin (0.13s)

=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:264: (dbg) Run:  kubectl --context calico-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.13s)

TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.24s)

=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p custom-flannel-925288 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.24s)

TestNetworkPlugins/group/custom-flannel/NetCatPod (10.28s)

=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context custom-flannel-925288 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-n9v4w" [1f6357d8-b37f-491b-8080-af7b6dc31c6f] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-n9v4w" [1f6357d8-b37f-491b-8080-af7b6dc31c6f] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 10.004919718s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (10.28s)

TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.27s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p enable-default-cni-925288 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.27s)

TestNetworkPlugins/group/enable-default-cni/NetCatPod (11.3s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context enable-default-cni-925288 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-bbw5t" [027dc928-c6f7-4b9b-85ce-13f1a859966d] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-bbw5t" [027dc928-c6f7-4b9b-85ce-13f1a859966d] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 11.005492384s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (11.30s)

TestNetworkPlugins/group/custom-flannel/DNS (0.16s)

=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context custom-flannel-925288 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.16s)

TestNetworkPlugins/group/custom-flannel/Localhost (0.14s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context custom-flannel-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.14s)

TestNetworkPlugins/group/custom-flannel/HairPin (0.16s)

=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context custom-flannel-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.16s)

TestNetworkPlugins/group/bridge/Start (64.44s)

=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p bridge-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p bridge-925288 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=kvm2  --container-runtime=containerd: (1m4.443179576s)
--- PASS: TestNetworkPlugins/group/bridge/Start (64.44s)

TestNetworkPlugins/group/enable-default-cni/DNS (0.18s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:175: (dbg) Run:  kubectl --context enable-default-cni-925288 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.18s)

TestNetworkPlugins/group/enable-default-cni/Localhost (0.13s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:194: (dbg) Run:  kubectl --context enable-default-cni-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.13s)

TestNetworkPlugins/group/enable-default-cni/HairPin (0.14s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:264: (dbg) Run:  kubectl --context enable-default-cni-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.14s)

TestNetworkPlugins/group/flannel/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-flannel" ...
helpers_test.go:344: "kube-flannel-ds-wpxhn" [0283e9b3-1421-4f7a-8a16-edbf4924c997] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 6.005649311s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (6.01s)

TestNetworkPlugins/group/flannel/KubeletFlags (0.21s)

=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p flannel-925288 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.21s)

TestNetworkPlugins/group/flannel/NetCatPod (10.23s)

=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context flannel-925288 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-plz74" [ff45656d-1f06-456c-896e-69434ef553ba] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-plz74" [ff45656d-1f06-456c-896e-69434ef553ba] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 10.003922879s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (10.23s)

TestNetworkPlugins/group/flannel/DNS (0.16s)

=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context flannel-925288 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.16s)

TestNetworkPlugins/group/flannel/Localhost (0.13s)

=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context flannel-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.13s)

TestNetworkPlugins/group/flannel/HairPin (0.12s)

=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context flannel-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.12s)

TestNetworkPlugins/group/bridge/KubeletFlags (0.21s)

=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p bridge-925288 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.21s)

TestNetworkPlugins/group/bridge/NetCatPod (9.20s)

=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context bridge-925288 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-q8vgk" [b275ea7a-da2a-431c-878b-928b5ee1cf6f] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-q8vgk" [b275ea7a-da2a-431c-878b-928b5ee1cf6f] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 9.004540764s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (9.20s)

TestNetworkPlugins/group/bridge/DNS (0.15s)

=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:175: (dbg) Run:  kubectl --context bridge-925288 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.15s)

TestNetworkPlugins/group/bridge/Localhost (0.13s)

=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:194: (dbg) Run:  kubectl --context bridge-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.13s)

TestNetworkPlugins/group/bridge/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:264: (dbg) Run:  kubectl --context bridge-925288 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.11s)

Test skip (39/327)

Order  Skipped test  Duration (s)
5 TestDownloadOnly/v1.20.0/cached-images 0
6 TestDownloadOnly/v1.20.0/binaries 0
7 TestDownloadOnly/v1.20.0/kubectl 0
14 TestDownloadOnly/v1.30.2/cached-images 0
15 TestDownloadOnly/v1.30.2/binaries 0
16 TestDownloadOnly/v1.30.2/kubectl 0
23 TestDownloadOnly/v1.31.0-beta.0/cached-images 0
24 TestDownloadOnly/v1.31.0-beta.0/binaries 0
25 TestDownloadOnly/v1.31.0-beta.0/kubectl 0
29 TestDownloadOnlyKic 0
43 TestAddons/parallel/Olm 0
57 TestDockerFlags 0
60 TestDockerEnvContainerd 0
62 TestHyperKitDriverInstallOrUpdate 0
63 TestHyperkitDriverSkipUpgrade 0
114 TestFunctional/parallel/DockerEnv 0
115 TestFunctional/parallel/PodmanEnv 0
152 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.01
153 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.01
154 TestFunctional/parallel/TunnelCmd/serial/WaitService 0.01
155 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.01
156 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.01
157 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.01
158 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.01
159 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.01
163 TestGvisorAddon 0
181 TestImageBuild 0
208 TestKicCustomNetwork 0
209 TestKicExistingNetwork 0
210 TestKicCustomSubnet 0
211 TestKicStaticIP 0
243 TestChangeNoneUser 0
246 TestScheduledStopWindows 0
248 TestSkaffold 0
250 TestInsufficientStorage 0
254 TestMissingContainerUpgrade 0
262 TestStartStop/group/disable-driver-mounts 0.14
269 TestNetworkPlugins/group/kubenet 3.06
277 TestNetworkPlugins/group/cilium 3.21
TestDownloadOnly/v1.20.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.20.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.20.0/cached-images (0.00s)

TestDownloadOnly/v1.20.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.20.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.20.0/binaries (0.00s)

TestDownloadOnly/v1.20.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.20.0/kubectl
aaa_download_only_test.go:167: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.20.0/kubectl (0.00s)

TestDownloadOnly/v1.30.2/cached-images (0s)

=== RUN   TestDownloadOnly/v1.30.2/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.30.2/cached-images (0.00s)

TestDownloadOnly/v1.30.2/binaries (0s)

=== RUN   TestDownloadOnly/v1.30.2/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.30.2/binaries (0.00s)

TestDownloadOnly/v1.30.2/kubectl (0s)

=== RUN   TestDownloadOnly/v1.30.2/kubectl
aaa_download_only_test.go:167: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.30.2/kubectl (0.00s)

TestDownloadOnly/v1.31.0-beta.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.31.0-beta.0/cached-images (0.00s)

TestDownloadOnly/v1.31.0-beta.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.31.0-beta.0/binaries (0.00s)

TestDownloadOnly/v1.31.0-beta.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.31.0-beta.0/kubectl
aaa_download_only_test.go:167: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.31.0-beta.0/kubectl (0.00s)

TestDownloadOnlyKic (0s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:220: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:500: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestDockerFlags (0s)

=== RUN   TestDockerFlags
docker_test.go:41: skipping: only runs with docker container runtime, currently testing containerd
--- SKIP: TestDockerFlags (0.00s)

TestDockerEnvContainerd (0s)

=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with containerd false linux amd64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)

TestHyperKitDriverInstallOrUpdate (0s)

=== RUN   TestHyperKitDriverInstallOrUpdate
driver_install_or_update_test.go:105: Skip if not darwin.
--- SKIP: TestHyperKitDriverInstallOrUpdate (0.00s)

TestHyperkitDriverSkipUpgrade (0s)

=== RUN   TestHyperkitDriverSkipUpgrade
driver_install_or_update_test.go:169: Skip if not darwin.
--- SKIP: TestHyperkitDriverSkipUpgrade (0.00s)

TestFunctional/parallel/DockerEnv (0s)

=== RUN   TestFunctional/parallel/DockerEnv
=== PAUSE TestFunctional/parallel/DockerEnv

=== CONT  TestFunctional/parallel/DockerEnv
functional_test.go:459: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/DockerEnv (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:546: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.01s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.01s)

TestFunctional/parallel/TunnelCmd/serial/WaitService (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/WaitService (0.01s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.01s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.01s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.01s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.01s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.01s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild (0s)

=== RUN   TestImageBuild
image_test.go:33: 
--- SKIP: TestImageBuild (0.00s)

TestKicCustomNetwork (0s)

=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

TestKicExistingNetwork (0s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

TestKicCustomSubnet (0s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestKicStaticIP (0s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:123: only run with docker/podman driver
--- SKIP: TestKicStaticIP (0.00s)

TestChangeNoneUser (0s)

=== RUN   TestChangeNoneUser
none_test.go:38: Test requires none driver and SUDO_USER env to not be empty
--- SKIP: TestChangeNoneUser (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestSkaffold (0s)

=== RUN   TestSkaffold
skaffold_test.go:45: skaffold requires docker-env, currently testing containerd container runtime
--- SKIP: TestSkaffold (0.00s)

TestInsufficientStorage (0s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:284: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

TestStartStop/group/disable-driver-mounts (0.14s)

=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts

=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:103: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-009208" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p disable-driver-mounts-009208
--- SKIP: TestStartStop/group/disable-driver-mounts (0.14s)

TestNetworkPlugins/group/kubenet (3.06s)

=== RUN   TestNetworkPlugins/group/kubenet
net_test.go:93: Skipping the test as containerd container runtimes requires CNI
panic.go:626: 
----------------------- debugLogs start: kubenet-925288 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> host: /etc/nsswitch.conf:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"

                                                
                                                

                                                
                                                
>>> host: /etc/hosts:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"

                                                
                                                

                                                
                                                
>>> host: /etc/resolv.conf:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"

                                                
                                                

                                                
                                                
>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: kubenet-925288

                                                
                                                

                                                
                                                
>>> host: crictl pods:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"

                                                
                                                

                                                
                                                
>>> host: crictl containers:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"

                                                
                                                

                                                
                                                
>>> k8s: describe netcat deployment:
error: context "kubenet-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe netcat pod(s):
error: context "kubenet-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: netcat logs:
error: context "kubenet-925288" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns deployment:
error: context "kubenet-925288" does not exist
>>> k8s: describe coredns pods:
error: context "kubenet-925288" does not exist
>>> k8s: coredns logs:
error: context "kubenet-925288" does not exist
>>> k8s: describe api server pod(s):
error: context "kubenet-925288" does not exist
>>> k8s: api server logs:
error: context "kubenet-925288" does not exist
>>> host: /etc/cni:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: ip a s:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: ip r s:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: iptables-save:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: iptables table nat:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> k8s: describe kube-proxy daemon set:
error: context "kubenet-925288" does not exist
>>> k8s: describe kube-proxy pod(s):
error: context "kubenet-925288" does not exist
>>> k8s: kube-proxy logs:
error: context "kubenet-925288" does not exist
>>> host: kubelet daemon status:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: kubelet daemon config:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> k8s: kubelet logs:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: /var/lib/kubelet/config.yaml:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Wed, 17 Jul 2024 18:23:44 UTC
        provider: minikube.sigs.k8s.io
        version: v1.33.1
      name: cluster_info
    server: https://192.168.39.77:8443
  name: old-k8s-version-373866
contexts:
- context:
    cluster: old-k8s-version-373866
    extensions:
    - extension:
        last-update: Wed, 17 Jul 2024 18:23:44 UTC
        provider: minikube.sigs.k8s.io
        version: v1.33.1
      name: context_info
    namespace: default
    user: old-k8s-version-373866
  name: old-k8s-version-373866
current-context: ""
kind: Config
preferences: {}
users:
- name: old-k8s-version-373866
  user:
    client-certificate: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt
    client-key: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.key
>>> k8s: cms:
Error in configuration: context was not found for specified context: kubenet-925288
>>> host: docker daemon status:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: docker daemon config:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: /etc/docker/daemon.json:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: docker system info:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: cri-docker daemon status:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: cri-docker daemon config:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: cri-dockerd version:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: containerd daemon status:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: containerd daemon config:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: /lib/systemd/system/containerd.service:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: /etc/containerd/config.toml:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: containerd config dump:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: crio daemon status:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: crio daemon config:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: /etc/crio:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
>>> host: crio config:
* Profile "kubenet-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-925288"
----------------------- debugLogs end: kubenet-925288 [took: 2.91187281s] --------------------------------
helpers_test.go:175: Cleaning up "kubenet-925288" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p kubenet-925288
--- SKIP: TestNetworkPlugins/group/kubenet (3.06s)
TestNetworkPlugins/group/cilium (3.21s)
=== RUN   TestNetworkPlugins/group/cilium
net_test.go:102: Skipping the test as it's interfering with other tests and is outdated
panic.go:626: 
----------------------- debugLogs start: cilium-925288 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: cilium-925288
>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: cilium-925288
>>> host: /etc/nsswitch.conf:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: /etc/hosts:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: /etc/resolv.conf:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: cilium-925288
>>> host: crictl pods:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: crictl containers:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> k8s: describe netcat deployment:
error: context "cilium-925288" does not exist
>>> k8s: describe netcat pod(s):
error: context "cilium-925288" does not exist
>>> k8s: netcat logs:
error: context "cilium-925288" does not exist
>>> k8s: describe coredns deployment:
error: context "cilium-925288" does not exist
>>> k8s: describe coredns pods:
error: context "cilium-925288" does not exist
>>> k8s: coredns logs:
error: context "cilium-925288" does not exist
>>> k8s: describe api server pod(s):
error: context "cilium-925288" does not exist
>>> k8s: api server logs:
error: context "cilium-925288" does not exist
>>> host: /etc/cni:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: ip a s:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: ip r s:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: iptables-save:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: iptables table nat:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> k8s: describe cilium daemon set:
Error in configuration: context was not found for specified context: cilium-925288
>>> k8s: describe cilium daemon set pod(s):
Error in configuration: context was not found for specified context: cilium-925288
>>> k8s: cilium daemon set container(s) logs (current):
error: context "cilium-925288" does not exist
>>> k8s: cilium daemon set container(s) logs (previous):
error: context "cilium-925288" does not exist
>>> k8s: describe cilium deployment:
Error in configuration: context was not found for specified context: cilium-925288
>>> k8s: describe cilium deployment pod(s):
Error in configuration: context was not found for specified context: cilium-925288
>>> k8s: cilium deployment container(s) logs (current):
error: context "cilium-925288" does not exist
>>> k8s: cilium deployment container(s) logs (previous):
error: context "cilium-925288" does not exist
>>> k8s: describe kube-proxy daemon set:
error: context "cilium-925288" does not exist
>>> k8s: describe kube-proxy pod(s):
error: context "cilium-925288" does not exist
>>> k8s: kube-proxy logs:
error: context "cilium-925288" does not exist
>>> host: kubelet daemon status:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: kubelet daemon config:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> k8s: kubelet logs:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: /var/lib/kubelet/config.yaml:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/19283-14409/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Wed, 17 Jul 2024 18:23:44 UTC
        provider: minikube.sigs.k8s.io
        version: v1.33.1
      name: cluster_info
    server: https://192.168.39.77:8443
  name: old-k8s-version-373866
contexts:
- context:
    cluster: old-k8s-version-373866
    extensions:
    - extension:
        last-update: Wed, 17 Jul 2024 18:23:44 UTC
        provider: minikube.sigs.k8s.io
        version: v1.33.1
      name: context_info
    namespace: default
    user: old-k8s-version-373866
  name: old-k8s-version-373866
current-context: ""
kind: Config
preferences: {}
users:
- name: old-k8s-version-373866
  user:
    client-certificate: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.crt
    client-key: /home/jenkins/minikube-integration/19283-14409/.minikube/profiles/old-k8s-version-373866/client.key
>>> k8s: cms:
Error in configuration: context was not found for specified context: cilium-925288
>>> host: docker daemon status:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: docker daemon config:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: /etc/docker/daemon.json:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"
>>> host: docker system info:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: cri-docker daemon status:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: cri-docker daemon config:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: cri-dockerd version:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: containerd daemon status:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: containerd daemon config:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: /lib/systemd/system/containerd.service:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: /etc/containerd/config.toml:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: containerd config dump:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: crio daemon status:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: crio daemon config:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: /etc/crio:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

>>> host: crio config:
* Profile "cilium-925288" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-925288"

----------------------- debugLogs end: cilium-925288 [took: 3.037490794s] --------------------------------
helpers_test.go:175: Cleaning up "cilium-925288" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p cilium-925288
--- SKIP: TestNetworkPlugins/group/cilium (3.21s)